
CN211890820U - Air-ground cooperative intelligent inspection robot - Google Patents

Air-ground cooperative intelligent inspection robot

Info

Publication number
CN211890820U
CN211890820U (application number CN202020466444.5U)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
robot
air
power supply
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202020466444.5U
Other languages
Chinese (zh)
Inventor
林立民 (Lin Limin)
邓若愚 (Deng Ruoyu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji Institute Of Artificial Intelligence Suzhou Co ltd
Original Assignee
Tongji Institute Of Artificial Intelligence Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji Institute Of Artificial Intelligence Suzhou Co ltd filed Critical Tongji Institute Of Artificial Intelligence Suzhou Co ltd
Priority to CN202020466444.5U
Application granted
Publication of CN211890820U
Active legal status
Anticipated expiration legal status

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The utility model relates to an air-ground cooperative intelligent inspection robot comprising a robot platform and an unmanned aerial vehicle (UAV). The robot platform comprises a vehicle body, wheels and a drive assembly arranged at the bottom of the vehicle body, and, arranged on the vehicle body: a robotic arm, an environment sensing assembly, a communicator, a robot controller, and a power supply assembly; the communicator establishes communication connections with the UAV and with a base station. The utility model builds an all-weather autonomous navigation platform spanning multiple classes of robots, ensuring that the robot can carry out assigned navigation and inspection tasks in all directions and in all weather. Drawing on Internet of Things, artificial intelligence, cloud computing, and big data technologies, it integrates environment perception, dynamic decision-making, and behavior control, and possesses capabilities such as autonomous perception, autonomous locomotion, autonomous protection, and interactive communication. It is a multifunctional, integrated intelligent piece of equipment that can help humans complete basic, repetitive, and dangerous security work, upgrade security services, and reduce the cost of security operations.

Description

Air-ground cooperative intelligent inspection robot
Technical Field
The utility model belongs to the technical field of robotics, and specifically relates to an air-ground cooperative intelligent inspection robot.
Background
In recent years, major explosion accidents in the chemical industry have mainly been caused by leaks of hazardous chemicals, and a principal cause of such leakage accidents is insufficient inspection manpower. To reduce the incidence of safety accidents, inspection has become an indispensable part of chemical plant operations. At present, maintenance and inspection are performed mainly by hand. Because personnel vary in skill and training, problems such as weak safety awareness, poorly implemented safety responsibility, and inadequate safety supervision are common, making the reliability and accuracy of inspection work difficult to guarantee. In addition, a chemical plant usually occupies a large area and contains a large number of pressure vessels and nearly a hundred kilometers of pressurized piping, so the working environment for inspection is very complicated and all inspection tasks can hardly be completed by manpower alone.
With the advance of intelligent operation and maintenance, demand for robotic inspection is growing daily, and the chemical industry is gradually using robots to replace manual inspection, which can greatly reduce labor costs while ensuring inspection efficiency and reliability. Although applying robots to chemical-industry safety inspection has many advantages, current inspection robots, whether unmanned vehicles or unmanned aerial vehicles, are generally single-robot devices with a low degree of intelligence.
Inspection robots at the present stage, including unmanned vehicles and unmanned aerial vehicles, mainly have the following shortcomings:
1. Lack of multi-robot cooperation between unmanned vehicles and UAVs:
A standalone unmanned-vehicle inspection device is limited by its working principle: it demands a flat road surface and cannot inspect rough terrain, climb stairs, or reach elevated locations. A standalone UAV inspection device is limited by its endurance: it has limited flight time and payload, cannot carry multiple sensors, and cannot fly long distances.
2. Lack of an autonomous mobile platform that can carry a UAV:
When an inspection UAV performs a task, operators must still drive it to a designated site by vehicle, an inspection mode that remains inefficient and wasteful of manpower and material resources.
3. Lack of autonomy:
When a present-stage UAV executes an inspection task, an on-site operator must still pilot it remotely to capture close-range images of the chemical environment and equipment; this inspection mode lacks autonomy and is inefficient.
4. Lack of explosion-proof properties:
Present-stage unmanned vehicles and UAVs are usually not explosion-proof. Their high-speed brushless motors and high-energy batteries carry ultra-high currents, so contact with combustible gas can trigger an explosion; the consequences of such fire and explosion hazards are unthinkable.
Disclosure of Invention
The utility model aims to provide an air-ground cooperative intelligent inspection robot applied to safety inspection work in the chemical industry.
To achieve the above purpose, the utility model adopts the following technical scheme:
the utility model provides an air-ground collaborative intelligence patrols and examines robot, includes robot platform, unmanned aerial vehicle, robot platform include the automobile body, set up automobile body bottom's wheel and drive assembly, set up and be in the automobile body on: robotic arm, environment perception subassembly, communicator, robot control ware and power supply module, the communicator right unmanned aerial vehicle and base station realize communication connection.
Preferably, the robot arm comprises a base arranged on the vehicle body, a joint assembly rotatably connected to the base, a mechanical claw rotatably connected to the joint assembly, and a rotary driving member for driving each component to rotate, wherein the joint assembly comprises one or more connecting joints which are rotatably connected end to end in sequence.
Further preferably, the joint assembly includes a first connecting joint rotatably connected to the base, a second connecting joint rotatably connected to the first connecting joint, a third connecting joint rotatably connected to the second connecting joint, a fourth connecting joint rotatably connected to the third connecting joint, and a fifth connecting joint rotatably connected to the fourth connecting joint, and the gripper is rotatably connected to the fifth connecting joint.
Further preferably, the mechanical gripper comprises a gripper body rotatably connected to the joint assembly, a pair of grippers connected to the gripper body, and a gripper driving assembly for driving the grippers to perform gripping operation, wherein the gripper driving assembly comprises a worm provided on the gripper body, a worm wheel connected to one end of the gripper and engaged with the worm, and a gripper driving member for driving the worm to rotate.
Preferably, the robot platform further comprises a platform main body arranged on the vehicle body for the UAV to take off, land, and charge; the platform main body is connected to the power supply assembly.
Preferably, the environment sensing assembly comprises a lidar, a sensor assembly, and an imaging assembly.
Further preferably, the sensor assembly includes a gas concentration sensor and a humidity and temperature sensor.
Further preferably, the imaging assembly comprises a visible-light high-definition camera, an infrared camera, and a monocular camera.
Preferably, the robot platform further comprises a touch screen display arranged on the vehicle body.
Further preferably, a microphone and a speaker are integrated in the touch screen display.
Preferably, the communicator is internally integrated with inertial navigation equipment and GPS equipment.
Preferably, the UAV comprises a fuselage, a UAV control assembly, and, arranged on the fuselage: a landing gear, propeller groups, a UAV drive and power supply assembly, and an imaging and sensing assembly.
Further preferably, the UAV control assembly comprises a flight control module and a data and image transmission module. The flight control module comprises a flight control sealing box arranged on the fuselage and, arranged in the flight control sealing box, a flight controller, a power supply manager, and a parameter adjusting interface; the flight controller is connected to the power supply manager and the parameter adjusting interface respectively. The data and image transmission module comprises a data and image transmission sealing box arranged on the fuselage, a data and image transmission UAV end arranged in the sealing box, and a data and image transmission ground end connected to the UAV end; the UAV end is connected to the flight controller.
Further preferably, the UAV drive and power supply assembly comprises a power supply explosion-proof box; a battery, an electronic speed regulator, and a concentrator arranged in the box; a motor base; and a motor arranged on the motor base. The battery is connected to the power supply manager and, through the concentrator, to the electronic speed regulator; the input end of the electronic speed regulator is connected to the parameter adjusting interface, and its output end to the motor.
Further preferably, the imaging and sensing assembly includes a camera, a sensor sealing box, a sensor control board arranged in the sensor sealing box, and a gas sensor connected to the sensor control board.
Further preferably, four propeller groups are provided, each with two propellers arranged one above the other.
Owing to the application of the above technical scheme, the utility model has the following advantages over the prior art:
the utility model discloses build all-weather autonomous navigation multiclass robot platform, guarantee that the robot can be all-round, all-weather carry out given navigation and patrol and examine the task, it utilizes the thing networking to synthesize a section, artificial intelligence, cloud computing, techniques such as big data, integrated environment perception, dynamic decision, behavior control etc. possess autonomous perception, independently walk, independently protect, ability such as interactive interchange, can help the mankind to accomplish basic nature, repeatability, dangerous security work, promote security service upgrade, reduce the multi-functional intelligent of synthesizing of security operation cost and equip.
Drawings
FIG. 1 is a schematic structural diagram of the present embodiment;
FIG. 2 is a schematic structural diagram of the robot platform in this embodiment (platform main body omitted);
FIG. 3 is a schematic structural diagram of a robot arm according to the present embodiment;
FIG. 4 is a schematic structural diagram of the gripper in this embodiment;
fig. 5 is a schematic structural diagram of the unmanned aerial vehicle in the embodiment;
FIG. 6 is a block diagram showing the structure of the present embodiment;
FIG. 7 is a schematic diagram of the relationship of the inspection system in this embodiment;
FIG. 8 is a flowchart of a process for implementing image sensing;
FIG. 9 is a schematic block diagram of air-ground cooperative multi-robot positioning and mapping in this embodiment;
FIG. 10 is a schematic block diagram of a sensing location calculation in the present embodiment;
FIG. 11 is a schematic block diagram of map creation in the present embodiment;
FIG. 12 is a schematic block diagram of multi-information fusion positioning in the present embodiment;
fig. 13a and 13b show a rotor dynamics model of the drone in this embodiment;
fig. 14 is a schematic block diagram of a tracking control system of the unmanned aerial vehicle according to the embodiment;
FIG. 15 is a schematic diagram of a state machine of the UAV of the present embodiment;
FIG. 16 is a schematic block diagram of the design of the DDPG trace tracking controller in this embodiment;
fig. 17 is a schematic view of the tracking error design in this embodiment: p is a point a specified distance L ahead of the vehicle, q is the trajectory target point, and pq is perpendicular to L;
fig. 18 is a schematic view of the cooperative inspection process of the unmanned vehicle/unmanned aerial vehicle in this embodiment;
FIG. 19 is a schematic diagram illustrating the extraction of event information from sensor data in the present embodiment;
FIG. 20 is a schematic block diagram of an accident prediction correlation model and architecture according to the present embodiment;
fig. 21 is a schematic block diagram of the video recognition of the action behavior of the person in the embodiment.
In the above drawings: 1. a robot platform; 10. a vehicle body; 11. a wheel; 12. a robotic arm; 120. a base; 121. a gripper; 1210. a claw body; 1211. a claw; 1212. a worm; 1213. a worm wheel; 122. a first connecting joint; 123. a second connecting joint; 124. a third connecting joint; 125. a fourth connecting joint; 126. a fifth connecting joint; 130. a lidar; 131. a gas concentration sensor; 132. a humidity and temperature sensor; 133. a visible-light high-definition camera; 134. an infrared camera; 135. a rotatable structure; 14. a communicator; 15. a touch screen display; 16. a robot controller; 17. a power supply assembly; 18. a platform main body; 2. an unmanned aerial vehicle; 20. a fuselage; 21. a landing gear; 22. a propeller; 23. a camera.
Detailed Description
The technical solution of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplification of description, but do not indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: the connection may, for example, be fixed, detachable, or integral; mechanical or electrical; direct or indirect through an intermediate medium; or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
An air-ground cooperative intelligent inspection robot, as shown in figs. 1 and 2, comprises a robot platform 1 and an unmanned aerial vehicle (UAV) 2. The robot platform 1 and the UAV 2 are described in detail below.
The robot platform 1 comprises a vehicle body 10, wheels and a drive assembly arranged at the bottom of the vehicle body 10, and, arranged on the vehicle body 10: a robotic arm 12, an environment sensing assembly, a communicator 14, a touch screen display 15, a robot controller 16, and a power supply assembly 17. Wherein:
the wheels and the driving component drive the wheels 11 to rotate by four motors matching with the speed reducer, and the four wheels 11 are respectively controlled by the robot controller 16, so that steering is realized through differential speed.
As shown in fig. 3: the robotic arm 12 comprises a base 120 arranged on the vehicle body 10, a joint assembly rotatably connected to the base 120, a gripper 121 rotatably connected to the joint assembly, and rotary driving members that drive the respective components to rotate. The joint assembly comprises one or more connecting joints rotatably connected end to end in sequence. In this embodiment, the joint assembly comprises a first connecting joint 122 rotatably connected to the base 120, a second connecting joint 123 rotatably connected to the first connecting joint 122, a third connecting joint 124 rotatably connected to the second connecting joint 123, a fourth connecting joint 125 rotatably connected to the third connecting joint 124, and a fifth connecting joint 126 rotatably connected to the fourth connecting joint 125; the gripper 121 is rotatably connected to the fifth connecting joint 126. The rotary driving members are motors: six motors drive six rotary joints, giving the robotic arm 12 motion in six degrees of freedom and driving the gripper 121 to perform grasping operations.
The robotic arm 12 can adjust the landing position of the UAV 2 so that the UAV 2 can be charged wirelessly; it can also swap various sensors for the UAV 2; and it can perform grasping, rotating, pressing, and similar operation tasks in hazardous environments, such as opening and closing a valve.
As shown in fig. 4: the gripper 121 comprises a claw body 1210 rotatably connected to the fifth connecting joint 126 of the joint assembly, a pair of claws 1211 connected to the claw body 1210, and a claw driving assembly that drives the claws 1211 to perform gripping operations. In this embodiment, the claw driving assembly comprises a worm 1212 arranged on the claw body 1210, a worm wheel 1213 connected to one end of the claw 1211 and engaged with the worm 1212, and a claw driving member that drives the worm 1212 to rotate. The claws 1211 are serrated to help ensure a reliable grip. The claw driving member may be a motor that drives the gripper 121 to perform its grasping function.
The environment sensing assembly comprises a lidar 130, a sensor assembly, and an imaging assembly. Wherein:
The lidar 130 is a radar system that detects characteristic quantities such as the position and speed of a target by emitting a laser beam. The robot platform 1 uses the lidar 130 to perceive the dynamic environment, collecting parameters such as target distance, azimuth, height, speed, attitude, and shape. Obstacle information around the robot platform 1 is detected by the lidar 130, and after the raw lidar data is acquired it must be preprocessed. The initial laser data is relatively noisy and includes abnormal readings arising from measurement and accidental errors, so it needs to be filtered, typically with a low-pass or Gaussian filter. The raw data volume is determined by the sensor resolution and is generally huge, so for practical application the data also needs downsampling; the filtered and downsampled lidar data then feeds perception functions such as obstacle recognition and object detection.
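The filtering and downsampling just described can be sketched as follows; the Gaussian kernel width and voxel size are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of the lidar preprocessing above: Gaussian smoothing of a
# 1-D range scan, then voxel-grid downsampling of the 3-D points.
import numpy as np

def gaussian_smooth(ranges, sigma=1.0, radius=3):
    """Low-pass filter a ring of range readings with a Gaussian kernel."""
    offsets = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (offsets / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(ranges, kernel, mode="same")

def voxel_downsample(points, voxel=0.10):
    """Keep one centroid per occupied voxel to cut the point count."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inverse).astype(float)
    out = np.zeros((inverse.max() + 1, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out
```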
The sensor assembly comprises a gas concentration sensor 131 and a humidity and temperature sensor 132. The gas concentration sensor 131 detects the components and concentration of gases in the air to determine whether hazardous gas is leaking, for example the concentrations of toxic gases (carbon monoxide, vinyl chloride, hydrogen sulfide, etc.) and flammable gases (hydrogen, methane, ethane, etc.). Once the ambient temperature and humidity or the concentration of a harmful or flammable gas exceeds a safety threshold, the reading is immediately reported to a worker for handling. The humidity and temperature sensor 132 acquires the temperature and humidity of the actual inspection area.
The imaging assembly comprises a visible-light high-definition camera 133, an infrared camera 134, and a monocular camera. The infrared camera 134 is mainly used for night patrols, shooting video images in darkness. To visually perceive and understand the chemical production environment, the monocular camera acquires and processes image information of the working environment. The inspection requirements of chemical environments are generally of the following types: whether a worker's protective equipment is fully worn; whether the exterior of a production device is intact; whether key indicator lights are normal; and so on.
To address these tasks, the conventional approach uses template matching for recognition, but it generalizes poorly: if the target in the image rotates or changes size, matching degrades badly. This embodiment instead recognizes images with deep learning: image samples of objects in the working scene are collected and given corresponding labels, a deep neural network is built, and the samples are used to train it until the network can recognize and classify the images, as shown in fig. 8.
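A minimal sketch of this train-a-classifier approach is given below, using PyTorch; the class names, image size, and network shape are assumptions for illustration, not the embodiment's actual network.

```python
# Minimal sketch: a small CNN trained on labeled field images.
import torch
import torch.nn as nn

classes = ["helmet_worn", "helmet_missing", "indicator_normal", "indicator_fault"]

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(classes)),  # assumes 64x64 input crops
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One gradient step on a batch of labeled scene images."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with dummy data shaped like a batch of 8 RGB 64x64 crops.
x = torch.randn(8, 3, 64, 64)
y = torch.randint(0, len(classes), (8,))
print(train_step(x, y))
```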
A rotatable structure 135 is provided on the vehicle body 10; the lidar 130, the visible-light high-definition camera 133, and the infrared camera 134 can be mounted on the rotatable structure 135.
The communicator 14 is a wireless communicator mainly responsible for exchanging information with the UAV 2 and the base station, ensuring unobstructed information transmission. Inertial navigation equipment and GPS equipment are integrated in the communicator 14.
The touch screen display 15 provides a human-machine interface through which the user can modify control parameters and view collected monitoring data. The human-machine interface, together with the highly intelligent robot system, simplifies operation and reduces labor cost. A microphone and a loudspeaker are integrated in the touch screen display 15, giving the robot platform 1 sound acquisition and audio playback functions.
The power supply assembly 17 comprises an explosion-proof case and a battery and motor arranged inside it. Explosion-proof capability is indispensable in the chemical industry: the ultra-high currents in the robot readily produce electric sparks during operation, and contact with combustible gas would create a risk of explosion with unthinkable consequences. If combustible gas leaks during inspection it may be ignited, so explosion-proof performance is the most important technical characteristic of the system, and the explosion-proof case structure is adopted to guarantee it.
In addition, the robot platform 1 further comprises a platform main body 18 arranged on the vehicle body 10 for the UAV 2 to take off, land, and charge; the platform main body 18 is connected to the power supply assembly 17. The platform main body 18 provides a carrying and take-off/landing platform for the UAV 2; it is circular, a shape better suited to the random error in the UAV's landing position. The platform main body 18 also provides wireless charging for the UAV 2: the UAV must be recharged promptly after each inspection sortie, so once the UAV 2 returns to the platform, the platform main body 18 charges it wirelessly. The high-power wireless power supply needs no physical connection and completes charging through non-radiative wireless energy transfer, so the UAV 2 is completely free of manual handling while charging, which greatly improves the automation level of UAV inspection.
As shown in figs. 5 and 6: the UAV 2 comprises a fuselage 20, a UAV control assembly, and, arranged on the fuselage 20: a landing gear 21, propeller groups, a UAV drive and power supply assembly, and an imaging and sensing assembly.
The UAV control assembly comprises a flight control module and a data and image transmission module. The flight control module comprises a flight control sealing box arranged on the fuselage 20 and, inside it, a flight controller, a power supply manager, and a parameter adjusting interface; the flight controller is connected to the power supply manager and the parameter adjusting interface respectively, and a barometer, an inertial navigation system, and an attitude stabilization system are built into the flight controller. The data and image transmission module comprises a data and image transmission sealing box arranged on the fuselage 20, a data and image transmission UAV end arranged in the sealing box, and a data and image transmission ground end connected to the UAV end; the UAV end is connected to the flight controller.
The UAV drive and power supply assembly comprises a power supply explosion-proof box; a battery, an electronic speed regulator, and a concentrator arranged in the box; a motor base; and a motor arranged on the motor base. The battery is connected to the power supply manager and, through the concentrator, to the electronic speed regulator; the input end of the electronic speed regulator is connected to the parameter adjusting interface, and its output end to the motor.
The imaging and sensing assembly comprises a camera 23, a sensor sealing box, a sensor control board arranged in the sensor sealing box, and a gas sensor connected to the sensor control board.
Four propeller groups are provided, each with two propellers 22 arranged one above the other. The coaxial double-propeller design gives the UAV stronger flight power so that it can carry more sensing components.
In addition, one robot platform may carry several UAVs, and the style of the robot platform is not restricted: changes such as the mounting positions of the sensor assembly and the manipulator, adding or removing wheels, or omitting the explosion-proof design also fall within the protection scope of this application.
The air-ground cooperative intelligent inspection method is specifically set forth as follows:
First: air-ground cooperative multi-robot positioning and mapping:
This mainly comprises high-precision, low-latency perception and positioning computation; air-ground cooperative multi-robot map creation in the chemical environment; and multi-information fusion positioning in the highly dynamic chemical environment. Specifically:
as shown in fig. 10: the perception positioning calculation is realized by three parts including a sensor unit, a clock synchronization device and a computer unit.
The sensor unit collects information about the environment around the robot using devices such as industrial cameras (color and grayscale), a three-dimensional lidar, an inertial navigation unit, and GPS. The raw data streams generated by the sensors are synchronized by the clock and sent to the computer unit. After collecting the sensor data, the computer unit preprocesses it; the processing includes point cloud noise filtering, normal vector analysis, feature point extraction, feature description, and so on. A computer unit is installed on the UAV and on the robot platform respectively, and valid environment perception and chemical detection data are transmitted back to the ground workstation for comprehensive analysis and processing.
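One preprocessing step named above, normal vector analysis, can be sketched as local principal component analysis over each point's neighborhood; the neighborhood size k is an illustrative assumption, not a value from the patent.

```python
# Minimal sketch of per-point normal estimation: the normal is the
# smallest-eigenvalue direction of the local neighborhood covariance.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=12):
    tree = cKDTree(points)
    normals = np.zeros_like(points)
    _, idx = tree.query(points, k=k)
    for i, neighbors in enumerate(idx):
        patch = points[neighbors] - points[neighbors].mean(axis=0)
        # Right singular vector of the smallest singular value.
        _, _, vt = np.linalg.svd(patch, full_matrices=False)
        normals[i] = vt[-1]
    return normals
```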
Map creation mainly solves problems such as map holes and lost viewpoints caused by the limited viewing angle of a single robot, and is the fundamental guarantee for constructing a high-precision, full-coverage environment map. Using the perception data sent by the computer units on the UAV and on the robot platform, this embodiment performs three-dimensional geometric reconstruction of the chemical plant environment in which the robots are located.
As shown in fig. 11: the map creation steps are as follows (a sketch of the plane alignment in step 3 follows the list):
1. The UAV and the robot platform scan the chemical plant environment for modeling under remote control, with their motion trajectories in a following state; the sensor data of the respective computer units are collected, and local three-dimensional point cloud maps are created in the respective reference coordinate systems;
2. The motion trajectories of the UAV and the robot platform are initially aligned using the GPS position information on each;
3. The ground plane portions of the two local maps are extracted, and the alignment of the two maps is then optimized with a plane-to-plane iterative closest point algorithm;
4. Based on optimization theory, the motion trajectories and local maps acquired by the UAV and the robot platform are globally optimized and adjusted to obtain a more accurate model of the chemical plant environment.
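Under stated assumptions, the core of step 3 reduces to fitting a ground plane in each local map and rotating one map's normal onto the other's; a full plane-to-plane ICP would iterate this with correspondences, so the sketch below shows only the single alignment step.

```python
# Minimal sketch of ground-plane alignment between two local maps.
import numpy as np

def fit_plane(points):
    """Return (centroid, unit normal) of the best-fit plane through points."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[-1]

def rotation_between(a, b):
    """Rotation matrix sending unit vector a onto unit vector b (Rodrigues)."""
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)  # undefined if a == -b

def align_ground(map_a_ground, map_b_ground, map_b_points):
    ca, na = fit_plane(map_a_ground)
    cb, nb = fit_plane(map_b_ground)
    R = rotation_between(nb, na)
    return (map_b_points - cb) @ R.T + ca  # express map B in map A's frame
```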
As shown in fig. 12: an extended Kalman filter framework is introduced for multi-information fusion positioning. Relative position information such as inertial navigation integration, wheel odometry, and laser odometry is effectively fused with absolute position information such as Global Positioning System (GPS) global coordinates and the registration of current sensor data against a prior map, yielding high-precision, low-latency robot pose information.
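A minimal sketch of this fusion idea follows: a 2-D pose state is predicted with relative (odometry) information and corrected with absolute (GPS) fixes. The noise matrices are illustrative, not the embodiment's tuning.

```python
# Minimal EKF sketch fusing odometry (predict) with GPS fixes (update).
import numpy as np

class PoseEKF:
    def __init__(self):
        self.x = np.zeros(3)                  # state [x, y, yaw]
        self.P = np.eye(3)
        self.Q = np.diag([0.02, 0.02, 0.01])  # odometry (process) noise
        self.R = np.diag([1.0, 1.0])          # GPS (measurement) noise

    def predict(self, v, omega, dt):
        """Relative information: integrate body velocity and yaw rate."""
        th = self.x[2]
        self.x += np.array([v * np.cos(th) * dt, v * np.sin(th) * dt, omega * dt])
        F = np.array([[1, 0, -v * np.sin(th) * dt],
                      [0, 1,  v * np.cos(th) * dt],
                      [0, 0,  1]])
        self.P = F @ self.P @ F.T + self.Q

    def update_gps(self, z):
        """Absolute information: GPS position fix z = [x, y]."""
        H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += K @ (z - H @ self.x)
        self.P = (np.eye(3) - K @ H) @ self.P
```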
Second: air-ground cooperative tracking and control:
This mainly comprises the design of the UAV flight control system, trajectory tracking control of the robot platform, and autonomous landing control of the UAV. Specifically:
Design of the UAV flight control system:
A dynamic model equation is established for the six spatial degrees of freedom of the UAV, a UAV actuator model combining a motor model with a propeller aerodynamic model is analyzed, and the relevant aerodynamic parameters of the UAV are calculated from measured thrust and torque curves. Given the input airflow velocity and the speeds and angular velocities of the UAV's four propeller groups, the model can compute the thrust and moment produced by each actuator at that moment, enabling simulation-based verification of the model, as shown in figs. 13a and 13b.
According to the kinematics and dynamics of the UAV, the model can be divided into an attitude model and a position model, and likewise the motion control method divides into position control and attitude control. A position together with a UAV attitude is called a target point; path control of the UAV is then a set of target points in space, and the UAV must reach the planned target points in order, as shown in fig. 14.
For attitude control, an active disturbance rejection controller regulates the UAV's altitude, yaw, pitch, and roll; for position control, a backstepping position controller is designed so the UAV can track a target trajectory. To give the UAV flexible maneuvering capability in complex environments, a state machine is designed on top of these controllers; it automatically adjusts the UAV's flight attitude and position according to the environmental state, as shown in fig. 15.
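The mode state machine layered over these controllers might be sketched as below; the states and transition events are assumptions for illustration, since the patent does not enumerate them.

```python
# Minimal sketch of a flight-mode state machine with assumed states/events.
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto(); TAKEOFF = auto(); TRACK = auto(); HOVER = auto(); LAND = auto()

# (state, event) -> next state; events come from perception and the mission plan.
TRANSITIONS = {
    (Mode.IDLE, "mission_start"): Mode.TAKEOFF,
    (Mode.TAKEOFF, "altitude_reached"): Mode.TRACK,
    (Mode.TRACK, "obstacle_close"): Mode.HOVER,   # pause and re-plan
    (Mode.HOVER, "path_clear"): Mode.TRACK,
    (Mode.TRACK, "waypoints_done"): Mode.LAND,
    (Mode.TRACK, "battery_low"): Mode.LAND,
}

def step(mode: Mode, event: str) -> Mode:
    return TRANSITIONS.get((mode, event), mode)   # unknown events keep the mode

mode = Mode.IDLE
for ev in ["mission_start", "altitude_reached", "battery_low"]:
    mode = step(mode, ev)
print(mode)  # Mode.LAND
```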
Trajectory tracking control of the robot platform:
and (3) performing trajectory tracking control by using reinforcement learning, wherein the reinforcement learning is a method for learning the controller without knowing control and mechanical knowledge. The reinforcement learning emphasizes the interaction with the environment, and is a dynamic learning process. The depth certainty strategy gradient algorithm (DDPG) is a depth reinforcement learning algorithm, inherits the characteristics of the strategy gradient algorithm and an actor-critic algorithm, and controls the wheeled robot to track and plan a path according to the state information of the wheeled robot and the information fed back by the environment by using the DDPG algorithm.
First, the driving error between the robot's actual position and the target point on the planned trajectory is computed by an error function and passed to the DDPG network. The DDPG network perceives the environment through the state description, makes an optimal decision from the current environment state, and guides its own learning through a reward function, finally achieving high-precision tracking of the planned path, as shown in fig. 16. The error function can be designed around the robot's lateral error: the robot obtains its current position from the perception system and the preset trajectory from the path planner, the distance between an assumed point p ahead of the robot and the target q is taken as the error value, and this lateral error continuously guides the robot along the trajectory.
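The lateral error of fig. 17 can be sketched directly; the look-ahead distance and the nearest-point choice of q (which approximates the perpendicular construction in the figure) are illustrative assumptions.

```python
# Minimal sketch of the look-ahead lateral error used to guide tracking.
import numpy as np

def lateral_error(pose, trajectory, lookahead=1.0):
    """pose = (x, y, yaw); trajectory = array of (x, y) waypoints."""
    x, y, yaw = pose
    p = np.array([x + lookahead * np.cos(yaw), y + lookahead * np.sin(yaw)])
    dists = np.linalg.norm(trajectory - p, axis=1)
    q = trajectory[np.argmin(dists)]              # trajectory target point
    return float(np.linalg.norm(p - q))

traj = np.stack([np.linspace(0, 10, 101), np.zeros(101)], axis=1)  # straight line
print(lateral_error((0.0, 0.5, 0.0), traj))       # vehicle offset 0.5 m above
```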
The DDPG network must be pre-trained before use, usually by simulating the real environment on a simulator. In the traditional DDPG algorithm, too few training samples make training inefficient and prevent the network from converging quickly. By improving the strategy of returning samples to the experience replay pool, network training is suspended while samples are few and the robot keeps exploring until the pool is filled, which accelerates training. Meanwhile, because the robot's exploration cost in a complex environment is too high and a great deal of useless work is consumed in early trial and error, transfer learning is used: a DDPG network pre-trained in a simple environment is placed into a complex one, and the environment complexity is increased gradually so that the network acquires the ability to generate motion strategies in complex environments.
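The replay-pool gating described above can be sketched in a few lines; the buffer capacity, sample threshold, and batch size are illustrative, and train_on() stands in for one DDPG update.

```python
# Minimal sketch: skip training until the replay pool is sufficiently filled.
import random
from collections import deque

buffer = deque(maxlen=100_000)
MIN_SAMPLES = 5_000   # below this, keep exploring and skip training
BATCH = 64

def maybe_train(train_on):
    if len(buffer) < MIN_SAMPLES:
        return False                          # exploration phase only
    train_on(random.sample(buffer, BATCH))    # one DDPG update on a batch
    return True

# During rollout: buffer.append((state, action, reward, next_state, done))
```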
Autonomous landing control of the UAV:
the robot platform (unmanned vehicle UGV) is used as a carrier of an Unmanned Aerial Vehicle (UAV), and can realize automatic take-off and automatic visual guidance landing of the UAV. In a normal routing inspection task, the unmanned vehicle performs routing inspection work according to a preset routing inspection path, under certain scenes, the unmanned vehicle cannot directly reach routing inspection points, and then the unmanned vehicle can reach the routing inspection points and is used as an extra eye of the unmanned vehicle to transmit routing inspection information to a robot controller of the unmanned vehicle; after the unmanned aerial vehicle finishes executing the information acquisition task, the unmanned aerial vehicle can automatically land on a platform main body of the unmanned aerial vehicle under the guidance of the visual identification, and then the inspection task is finished. Unmanned aerial vehicle independently descends and needs the adjustment of control unmanned aerial vehicle's height and gesture, and this needs to design the visual identification that has the directionality and certain specification. Unmanned aerial vehicle flies to the unmanned aerial vehicle sky through planning system's planning route at first, and rethread vision guide system searches for the visual identification, in case detect the visual identification, just starts the automated guidance and descends the procedure, realizes unmanned aerial vehicle's autonomic descending, and wherein the controller of control unmanned aerial vehicle descending in-process can use fuzzy controller, increases the ride comfort of descending process, as shown in fig. 18.
Third: accident detection and early warning:
the present embodiment will build a discrete event system model for the operation dynamics of the chemical plant to analyze the sensor data with the theory related to the discrete event system. For data of different structures such as action behaviors of personnel, temperature, gas concentration and the like in the system, a sensor data dictionary is established according to a data structure and a data format, a sensor data packet analysis method, comprehensive logic judgment, deep learning and other methods are established, required feature information in the data is extracted, and then a strong mapping relation between the feature information and event information is established by combining information theory knowledge, as shown in fig. 19.
According to the faults and accidents that may occur in the actual system, a mapping from faults to sensor events and a mapping from accidents to sensor event sequences are established: a system fault is modeled as a fault event in the system, and a system accident as a combination of event sequences. A state-tree-based system diagnoser is constructed from these mappings; by observing system events, the diagnoser performs fault detection and accident prediction online in real time. When a fault is detected, the diagnoser issues a system warning; for accident prediction, the diagnoser computes the probability of an accident in real time and issues a warning when it exceeds the threshold set by the system, as shown in fig. 20.
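The event-sequence side of such a diagnoser can be sketched as follows; the accident signatures, the risk measure, and the warning threshold are invented for illustration, not taken from the patent.

```python
# Minimal sketch: accident signatures as event sequences; estimated risk
# grows with the fraction of a signature already observed in order.

SIGNATURES = {
    "leak_explosion": ["valve_fault", "gas_rise", "gas_high", "spark"],
    "overpressure":   ["pump_fault", "pressure_rise", "relief_stuck"],
}
THRESHOLD = 0.6

def accident_risk(observed, signature):
    """Fraction of the signature matched, in order, by observed events."""
    i = 0
    for ev in observed:
        if i < len(signature) and ev == signature[i]:
            i += 1
    return i / len(signature)

def diagnose(observed):
    for name, sig in SIGNATURES.items():
        risk = accident_risk(observed, sig)
        if risk >= THRESHOLD:
            print(f"WARNING: {name} risk {risk:.0%}")

diagnose(["pump_ok", "valve_fault", "gas_rise", "gas_high"])  # leak risk 75%
```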
For personnel safety during chemical production, video monitoring technology is used to detect and track people and to recognize specific behaviors. Detection and tracking must balance the speed and accuracy demands of behavior recognition and analysis in the micro-environment, so a pedestrian detection and tracking algorithm combining deep features with hand-crafted features can be adopted, achieving the best detection accuracy while meeting the system's real-time requirement. This is combined with a deep-learning object detection algorithm that automatically identifies whether personnel in the production area are wearing safety helmets, protective goggles, long-sleeved work clothes, and so on. When recognizing and analyzing specific personnel behaviors, a neural-network-based face recognition method identifies the tracked target; multiple streams of tracking information are classified and fused; the relationships between individual behaviors and motion trajectories, and between individual and group behaviors, are analyzed; an automaton model of individual and group behaviors is established; and the behaviors are recognized using discrete event system theory, as shown in fig. 21.
The embodiment can achieve the following beneficial effects:
1. Synergy between the unmanned vehicle and the UAV:
The utility model designs a wheeled robot platform for carrying a UAV. It meets the inspection UAV's carrying, take-off, and landing needs, removes the limitation that staff must carry the UAV to the site, and greatly improves the UAV's autonomous operating capability. The wheeled robot platform can carry a large high-energy battery and recharge the UAV when its power runs low. The platform can also carry different types of sensor modules, including a visible-light camera, an infrared camera, a lidar, and a satellite navigation receiver, making up for the UAV's limited payload; the UAV can select the required sensor module on the platform, and the robotic arm swaps sensor modules for the UAV, greatly improving inspection capability and efficiency.
2. Good explosion-proof performance:
The utility model uses explosion-proof box isolation to separate instantaneous high currents from the outside world, so that the inspection robot possesses explosion-proof performance.
3. Highly intelligent environment perception and understanding:
The utility model fuses multiple sensors to perceive and understand the chemical production environment. It proposes training a deep neural network on image samples collected in the field, realizing the recognition and detection of people and equipment in the chemical production environment; compared with traditional recognition methods, the detection results are more reliable and the model generalizes better.
4. Simultaneous mapping and positioning in a dynamic environment:
Facing the complex and changeable chemical environment, the utility model introduces an air-ground cooperative multi-robot SLAM method, breaks through large-scale full-coverage environment modeling for chemical plant areas, tackles multi-sensor fusion positioning, and achieves high-precision modeling and safe, reliable positioning of the inspection robot in a mixed chemical environment.
5. High-speed high-precision path planning technology:
The utility model applies a hybrid algorithm to the solution of path searching, making the algorithm more efficient; kinematic constraints are merged into the global planning so that trajectories are more reasonable and easier to track; and air-ground cooperative path planning can be accomplished, with the robots cooperating in motion while each guarantees that its own path is feasible.
6. Accident early warning capability:
Addressing the accidents that may occur during chemical production, the utility model designs a set of heterogeneous-data accident prediction algorithms based on discrete event system theory. With the air-ground cooperative intelligent inspection robot platform running stably, the inspection robot completes safety inspection work in the chemical operating environment, supervises accident safety problems in the production process, greatly reduces safety accidents, and safeguards chemical enterprises' production property and the lives of personnel.
The above embodiments are only intended to illustrate the technical concept and features of the present invention; their purpose is to enable those skilled in the art to understand and implement the present invention, and they do not limit its protection scope. All equivalent changes and modifications made according to the spirit of the present invention shall fall within its protection scope.

Claims (10)

1. An air-ground cooperative intelligent inspection robot, characterized in that: it comprises a robot platform and an unmanned aerial vehicle; the robot platform comprises a vehicle body, wheels and a drive assembly arranged at the bottom of the vehicle body, and, arranged on the vehicle body: a robotic arm, an environment sensing assembly, a communicator, a robot controller and a power supply assembly; the communicator establishes communication connections with the unmanned aerial vehicle and a base station.
2. The air-ground cooperative intelligent inspection robot according to claim 1, wherein the robotic arm comprises a base arranged on the vehicle body, a joint assembly rotatably connected to the base, a mechanical gripper rotatably connected to the joint assembly, and rotary driving members driving the respective components to rotate, the joint assembly comprising one or more connecting joints rotatably connected end to end in sequence.
3. The air-ground cooperative intelligent inspection robot according to claim 2, wherein the mechanical gripper comprises a claw body rotatably connected to the joint assembly, a pair of claws connected to the claw body, and a claw driving assembly driving the claws to grip, the claw driving assembly comprising a worm arranged on the claw body, a worm wheel connected to one end of the claw and engaged with the worm, and a claw driving member driving the worm to rotate.
4. The air-ground cooperative intelligent inspection robot according to claim 1, wherein the robot platform further comprises a platform main body arranged on the vehicle body for the unmanned aerial vehicle to take off, land and charge, the platform main body being connected to the power supply assembly.
5. The air-ground cooperative intelligent inspection robot according to claim 1, wherein the environment sensing assembly comprises a lidar, a sensor assembly and an imaging assembly.
6. The air-ground cooperative intelligent inspection robot according to claim 1, wherein the unmanned aerial vehicle comprises a fuselage, an unmanned aerial vehicle control assembly, and, arranged on the fuselage: a landing gear, propeller groups, an unmanned aerial vehicle drive and power supply assembly, and an imaging and sensing assembly.
7. The air-ground cooperative intelligent inspection robot according to claim 6, wherein the unmanned aerial vehicle control assembly comprises a flight control module and a data and image transmission module; the flight control module comprises a flight control sealing box arranged on the fuselage and, arranged in the flight control sealing box, a flight controller, a power supply manager and a parameter adjusting interface, the flight controller being connected to the power supply manager and the parameter adjusting interface respectively; the data and image transmission module comprises a data and image transmission sealing box arranged on the fuselage, a data and image transmission unmanned aerial vehicle end arranged in the sealing box, and a data and image transmission ground end connected to the unmanned aerial vehicle end, the unmanned aerial vehicle end being connected to the flight controller.
8. The air-ground cooperative intelligent inspection robot according to claim 7, wherein the unmanned aerial vehicle drive and power supply assembly comprises a power supply explosion-proof box; a battery, an electronic speed regulator and a concentrator arranged in the power supply explosion-proof box; a motor base; and a motor arranged on the motor base; the battery is connected to the power supply manager and, through the concentrator, to the electronic speed regulator, the input end of the electronic speed regulator is connected to the parameter adjusting interface, and the output end of the electronic speed regulator is connected to the motor.
9. The air-ground cooperative intelligent inspection robot according to claim 6, wherein the imaging and sensing assembly comprises a camera, a sensor sealing box, a sensor control board arranged in the sensor sealing box, and a gas sensor connected to the sensor control board.
10. The air-ground cooperative intelligent inspection robot according to claim 6, wherein four propeller groups are provided, each propeller group having two propellers arranged one above the other.
CN202020466444.5U 2020-04-02 2020-04-02 Air-ground cooperative intelligent inspection robot Active CN211890820U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020466444.5U CN211890820U (en) 2020-04-02 2020-04-02 Air-ground cooperative intelligent inspection robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202020466444.5U CN211890820U (en) 2020-04-02 2020-04-02 Air-ground cooperative intelligent inspection robot

Publications (1)

Publication Number Publication Date
CN211890820U true CN211890820U (en) 2020-11-10

Family

ID=73272077

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020466444.5U Active CN211890820U (en) 2020-04-02 2020-04-02 Air-ground cooperative intelligent inspection robot

Country Status (1)

Country Link
CN (1) CN211890820U (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111300372A (en) * 2020-04-02 2020-06-19 同济人工智能研究院(苏州)有限公司 Air-ground cooperative intelligent inspection robot and inspection method
WO2021196529A1 (en) * 2020-04-02 2021-10-07 同济人工智能研究院(苏州)有限公司 Air-ground cooperative intelligent inspection robot and inspection method
CN111300372B (en) * 2020-04-02 2024-12-10 同济人工智能研究院(苏州)有限公司 Air-ground collaborative intelligent inspection robot and inspection method
CN112667717A (en) * 2020-12-23 2021-04-16 贵州电网有限责任公司电力科学研究院 Transformer substation inspection information processing method and device, computer equipment and storage medium
CN112667717B (en) * 2020-12-23 2023-04-07 贵州电网有限责任公司电力科学研究院 Transformer substation inspection information processing method and device, computer equipment and storage medium
CN113467453A (en) * 2021-07-05 2021-10-01 天津理工大学 Inspection robot and method for controlling inspection robot to run based on fuzzy PID
CN114313037A (en) * 2021-12-24 2022-04-12 中国兵器工业计算机应用技术研究所 Ground-air cooperative unmanned automatic equipment
CN118636107A (en) * 2024-07-12 2024-09-13 山东瓦利斯智能科技有限公司 A multi-axis mechanical arm inspection robot and an inspection method for a working space

Similar Documents

Publication Publication Date Title
CN111300372B (en) Air-ground collaborative intelligent inspection robot and inspection method
CN211890820U (en) Air-ground cooperative intelligent inspection robot
Luo et al. A survey of intelligent transmission line inspection based on unmanned aerial vehicle
CN1305194C (en) Power circuit scanning test robot airplane and controlling system
CN101477169B (en) Electric power circuit detection method by polling flying robot
CN113671994A (en) Multi-unmanned aerial vehicle and multi-unmanned ship inspection control system based on reinforcement learning
CN111198004A (en) Electric power inspection information acquisition system based on unmanned aerial vehicle
CN102941920A (en) High-tension transmission line inspection robot based on multi-rotor aircraft and method using robot
CN102190081B (en) Vision-based fixed point robust control method for airship
Lee et al. Artificial intelligence and internet of things for robotic disaster response
CN103078673A (en) Special unmanned helicopter system suitable for routing inspection on power grid in mountain area
CN208873047U (en) A kind of inspection device based on multi-rotor unmanned aerial vehicle
CN109101039A (en) Vertical detection method and system
CN118550259A (en) Intelligent unmanned ocean monitoring network system and operation method
CN111459190A (en) Unmanned aerial vehicle for automatic inspection of large-scale centralized photovoltaic power station and inspection method
Tsintotas et al. The MPU RX-4 project: Design, electronics, and software development of a geofence protection system for a fixed-wing vtol uav
CN115847436A (en) Mobile gas acquisition, analysis, early warning and inspection robot
CN207249489U (en) A kind of unmanned plane and robot link job platform in the air
CN112233270A (en) Intelligent autonomous around-tower inspection system for unmanned aerial vehicle
CN117386567B (en) A wind turbine blade detection method and system
CN221234074U (en) Unmanned aerial vehicle surveys of multidimensional operation
Angelis et al. UAV design for fully autonomous man overboard detection
CN117850465A (en) Unmanned aerial vehicle flight control system and method for forest fire prevention
Ollero et al. Multi-Aerial Robotic System for Power Line Inspection and Maintenance: Comparative Analysis from the AERIAL-CORE Final Experiments
CN114397909A (en) Automatic inspection method for small unmanned aerial vehicle of large airplane

Legal Events

Date Code Title Description
GR01 Patent grant