CN115256468B - A state detection and standing planning method for a humanoid robot after falling down - Google Patents
A state detection and standing planning method for a humanoid robot after falling down
- Publication number
- CN115256468B CN115256468B CN202211034551.0A CN202211034551A CN115256468B CN 115256468 B CN115256468 B CN 115256468B CN 202211034551 A CN202211034551 A CN 202211034551A CN 115256468 B CN115256468 B CN 115256468B
- Authority
- CN
- China
- Prior art keywords
- robot
- state
- standing
- value
- planning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0095—Means or methods for testing manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Abstract
The invention provides a state detection and standing planning method for a humanoid robot after a fall. After the robot falls, the position and orientation of each link are calculated from the robot's orientation and the angle values of the joint motors; at the same time, the position and orientation of each link are acquired by a motion capture suit. The two sets of values are compared element by element: when the absolute value of their difference is less than or equal to a set threshold, the robot body structure is judged normal, and the external environment of the robot is then detected to decide whether the robot is able to stand. If the robot is capable of standing, the Mahalanobis distance is calculated between the real-time state value and the reference values of the states that must be passed through during standing, and the corresponding joints of the robot are controlled to reach the state with the minimum distance. The invention solves the problems of state detection and standing planning for a robot in a complex environment and improves the adaptability of the robot.
Description
Technical Field
The invention relates to the technical field of humanoid robots, and in particular to a method for detecting the state of a humanoid robot after a fall and planning its standing.
Background
The purpose of a humanoid robot is to help or replace humans in completing tasks, but owing to the complexity of the environment, a humanoid robot is likely to fall while performing a task. Unlike a human, a humanoid robot cannot sense its body posture after a fall or judge whether its joints are damaged, so a method is needed to detect the robot's state after a fall and to decide, based on the detected state, whether the task can be continued.
The prior art focuses on how a humanoid robot avoids falling, or how it performs a protective action when a fall is unavoidable; there are few studies on detecting the robot's state after a fall.
Disclosure of Invention
In view of the above, the invention provides a method for detecting the state of a humanoid robot after a fall and planning its standing.
The present invention achieves the above technical object by the following means.
A state detection and standing planning method for a humanoid robot after falling down comprises the following steps:
After the robot falls, the industrial personal computer calculates the position and orientation of each link from the robot's orientation and the angle values of the joint motors; this set of values is recorded as S1. At the same time, the motion capture suit acquires the position and orientation of each link, recorded as S2, and transmits them to the industrial personal computer.
If |S1-S2| ≤ E, the robot body structure is normal; the external environment of the robot is then detected and the robot's standing is planned. If |S1-S2| > E, the robot body is abnormal and must be overhauled immediately. Here E is a set threshold.
In the above technical scheme, detecting the external environment of the robot specifically comprises: installing piezoelectric film sensors at the key collision points of the robot; if the initial value of a piezoelectric film sensor changes, the body part corresponding to that key collision point is judged to be in contact with the ground.
In the above technical solution, the key collision points include the forehead, back of the head, chest, back, front waist, rear waist, left and right palms, left and right backs of the hands, left and right knees, and front and rear heels of the robot.
In the above technical scheme, planning the robot's standing specifically comprises: calculating the distance between the robot's real-time state value and the reference value of each state that must be passed through during standing, and controlling the corresponding joints of the robot to reach the state with the minimum distance.
In the above technical solution, the real-time state value of the robot and the reference value of each state that must be passed through during standing each include: the angle values of the joint motors, the positions and orientations of the links, the data acquired by the inertial measurement unit, the data acquired by the force sensors, and the values of the key collision points.
In the above technical solution, after the corresponding joints of the robot reach the state with the minimum distance, it is detected whether the angle values of the joint motors, the positions and orientations of the links, the data acquired by the inertial measurement unit, the data acquired by the force sensors, and the values of the key collision points equal the set reference values; if so, the next state transition continues; otherwise, the state detection is repeated, the optimal state is searched for, and the corresponding joints are controlled to reach the optimal state.
In the above technical scheme, the distance is calculated by the Mahalanobis distance calculation method.
In the above technical scheme, the inertial measurement unit is mounted on the robot's head, and the force sensors are mounted at the robot's two ankles.
The beneficial effects of the invention are as follows: the invention first detects the state of the robot after a fall, including body-based posture detection and external-environment-based detection, and judges from the detected state whether the robot is capable of standing. If the robot is capable of standing, the Mahalanobis distance is calculated between the real-time state value and the reference values of the states that must be passed through during standing, and the corresponding joints of the robot are controlled to reach the state with the minimum distance, thereby completing the robot's standing. The invention solves the problems of state detection and standing planning for a robot in a complex environment and improves the adaptability of the robot.
Drawings
FIG. 1 is a flow chart of a method for detecting and planning standing after a fall of a humanoid robot according to the present invention;
FIG. 2 is a schematic diagram of the structure of the humanoid robot of the present invention;
FIG. 3 is a schematic view of a key collision point of a robot according to the present invention;
FIG. 4 is a diagram showing the robot standing state transitions according to the present invention.
Detailed Description
The invention will be further described with reference to the drawings and the specific embodiments, but the scope of the invention is not limited thereto.
When a humanoid robot performs a task, a fall can easily occur owing to the complexity of the environment. After a fall, the robot is expected to stand up immediately and continue the task. However, a humanoid robot lacks the sensitive body perception of a human: after a fall it cannot judge what state its body is in or know which parts of its body are in contact with the ground, which makes standing up very difficult.
In view of these problems, the present invention provides a method for detecting the state of a humanoid robot after a fall and planning its standing, as shown in FIG. 1; the method includes body-based posture detection and external-environment-based detection.
FIG. 2 shows the structure of a humanoid robot designed according to bionic principles. The robot has 20 degrees of freedom (i.e. 20 joints), with 20 corresponding joint encoders and 20 joint motors; an inertial measurement unit (IMU) is mounted on the robot's head, and a force sensor is mounted at each of the robot's two ankles.
The method for detecting the state of a humanoid robot after a fall and planning its standing is described in detail below.
Body-based posture detection: when it is detected that the robot has fallen (the detection method itself is prior art), the inertial measurement unit (IMU) acquires the orientation of the robot and transmits it to the industrial personal computer. The industrial personal computer reads the motor angle values of the robot's 20 joints and uses forward kinematics to calculate the position and orientation of each link, recorded as S1 (computing link poses from the robot orientation and motor angle values is prior art). Because a fall may cause joint encoder faults or links to break or fall off, the positions and orientations of the links are also acquired from the motion capture suit worn on the robot body, recorded as S2, and transmitted to the industrial personal computer. With the two arrays obtained, the link positions and orientations are compared element by element: when the absolute value of the difference is less than or equal to the set threshold E, the robot body structure is considered normal and the robot can continue to work; when the absolute value of the difference is greater than E, the robot body is considered abnormal and must be overhauled immediately.
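The comparison between the kinematically computed poses S1 and the motion-capture poses S2 can be sketched as follows (a minimal illustration; the array layout and the example threshold value are assumptions, not taken from the patent):

```python
import numpy as np

def body_structure_normal(s1, s2, threshold):
    """Return True if every link pose computed by forward kinematics (s1)
    agrees with the pose reported by the motion capture suit (s2) to
    within the set threshold E."""
    diff = np.abs(np.asarray(s1, dtype=float) - np.asarray(s2, dtype=float))
    return bool(np.all(diff <= threshold))

# Hypothetical link poses (e.g. position coordinates and orientation angles)
s1 = np.array([0.10, 0.20, 0.05, 1.57])
s2 = np.array([0.11, 0.19, 0.06, 1.58])
print(body_structure_normal(s1, s2, 0.05))   # within E: body structure normal
print(body_structure_normal(s1, s2, 0.005))  # beyond E: overhaul required
```

In practice S1 and S2 would each hold the pose of all links; the element-wise absolute difference directly implements the |S1-S2| ≤ E test described above.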
External-environment-based detection: once body-based posture detection has confirmed that the robot body structure is normal, the external environment of the robot must be detected in order to complete the standing task. Piezoelectric film sensors are mounted at the robot's key collision points, which include the forehead, back of the head, chest, back, front waist, rear waist, left and right palms, left and right backs of the hands, left and right knees, and front and rear heels, as shown in FIG. 3. The data collected by the piezoelectric film sensors are transmitted to the industrial personal computer. Each sensor's initial value is set to 0 and becomes 1 when the sensor is under pressure, so the parts of the body in contact with the ground can be identified; the data collected by the force sensors are likewise transmitted to the industrial personal computer in real time.
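The binary contact detection can be illustrated with a short sketch (the point names and list ordering are hypothetical; the patent only specifies 0/1 sensor values at the named collision points):

```python
# Hypothetical ordering of the key collision points from FIG. 3
KEY_COLLISION_POINTS = (
    "forehead", "back_of_head", "chest", "back",
    "front_waist", "rear_waist",
    "left_palm", "right_palm",
    "left_back_of_hand", "right_back_of_hand",
    "left_knee", "right_knee",
    "front_heel", "rear_heel",
)

def ground_contacts(sensor_values):
    """Each piezoelectric film sensor starts at 0 and reads 1 under
    pressure; return the names of the points touching the ground."""
    return [name for name, v in zip(KEY_COLLISION_POINTS, sensor_values) if v == 1]

# Example: back, rear waist, right knee and rear heel touch the ground
readings = [0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print(ground_contacts(readings))
# ['back', 'rear_waist', 'right_knee', 'rear_heel']
```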
When body-based posture detection and external-environment-based detection are complete, the motor angle values of the robot's 20 joints, the positions and orientations of the links, the data acquired by the inertial measurement unit (IMU), the data acquired by the force sensors, and the values of the key collision points (i.e. the data acquired by the piezoelectric film sensors) are all available; these values are assembled into an array Mx.
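Assembling the state array Mx might look like the following sketch (the field sizes are illustrative assumptions based on the robot described: 20 joints, one IMU, two ankle force sensors, and 14 collision points):

```python
import numpy as np

def assemble_state(joint_angles, link_poses, imu_data, force_data, collision_values):
    """Concatenate all detection results into one flat state array Mx."""
    return np.concatenate([
        np.ravel(joint_angles),      # motor angle values of the 20 joints
        np.ravel(link_poses),        # positions and orientations of the links
        np.ravel(imu_data),          # inertial measurement unit data
        np.ravel(force_data),        # force sensor data (two ankles)
        np.ravel(collision_values),  # binary key-collision-point values
    ])

# Assumed sizes: 20 joints, 6 pose values per link x 20 links, 6 IMU values,
# 2 force readings, 14 collision points -> a 162-element state vector
mx = assemble_state(np.zeros(20), np.zeros(120), np.zeros(6), np.zeros(2), np.zeros(14))
print(mx.shape)  # (162,)
```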
Robot standing planning: the standing process of the robot is divided, in the manner of state transitions, into several states A, B, C, ... that the robot must pass through. Each state comprises the motor angle values of the 20 joints, the positions and orientations of the links, the data acquired by the inertial measurement unit (IMU), the data acquired by the force sensors, and the values of the key collision points, with corresponding reference arrays M_A, M_B, M_C, .... When the robot wants to stand, the Mahalanobis distance between the current state value (the array Mx obtained from state detection) and each reference value M_A, M_B, M_C, ... is calculated; once the reference with the minimum distance is found, the corresponding joints of the robot are controlled to reach that state.
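The nearest-reference-state search can be sketched with the standard Mahalanobis distance (the covariance matrix and the toy reference vectors are assumptions; the patent does not specify how the covariance is obtained):

```python
import numpy as np

def mahalanobis(x, mu, cov_inv):
    """Mahalanobis distance sqrt((x - mu)^T @ cov_inv @ (x - mu))."""
    d = np.asarray(x, dtype=float) - np.asarray(mu, dtype=float)
    return float(np.sqrt(d @ cov_inv @ d))

def nearest_state(mx, references, cov_inv):
    """references maps a state name ('A', 'B', ...) to its reference
    array M_A, M_B, ...; return the name with the minimum distance."""
    return min(references, key=lambda name: mahalanobis(mx, references[name], cov_inv))

# Toy 2-D example; an identity covariance reduces to Euclidean distance
refs = {"A": np.array([0.0, 0.0]), "B": np.array([1.0, 1.0]), "C": np.array([2.0, 0.0])}
cov_inv = np.eye(2)
print(nearest_state(np.array([0.2, 0.1]), refs, cov_inv))  # 'A'
```

With a non-identity inverse covariance, correlated or high-variance components of the state vector (e.g. noisy force readings) are weighted down, which is the usual motivation for preferring the Mahalanobis distance over the Euclidean one here.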
Because some interfering slip may occur after the robot reaches the target point (the state with the minimum distance), secondary detection is required to prevent false arrival. Specifically: after the robot reaches the target point, it is checked whether the angle values of the 20 joint motors, the positions and orientations of the links, the data acquired by the inertial measurement unit (IMU), the data acquired by the force sensors, and the values of the key collision points equal the set reference values (M_A, M_B, M_C, ...). If so, the next state transition proceeds; if not, the state detection is repeated, the Mahalanobis distance from the current state value to each reference value is calculated to find the minimum, the optimal state is selected, and the corresponding joints are controlled to reach that state.
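The secondary-detection loop can be sketched as follows (the `move_to`/`read_state` callbacks, tolerance handling, and retry limit are assumptions for illustration):

```python
import numpy as np

def reach_state_verified(target, references, cov_inv, move_to, read_state,
                         tol=1e-3, max_replans=5):
    """Command the joints toward `target`, then verify arrival by comparing
    the measured state against the reference; on mismatch (e.g. slip causing
    a 'false arrival'), replan to the nearest reference state and retry."""
    for _ in range(max_replans):
        move_to(target)
        mx = read_state()
        if np.max(np.abs(mx - references[target])) <= tol:
            return target  # verified: the next state transition may proceed
        # Replan: search for the optimal state from the actual current state
        dists = {k: (mx - v) @ cov_inv @ (mx - v) for k, v in references.items()}
        target = min(dists, key=dists.get)
    return None  # could not verify arrival at any reference state
```

A usage sketch: `move_to` would send joint commands to the motors, and `read_state` would re-run the state detection described above to rebuild Mx.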
Examples: as shown in the lower left corner of fig. 4, when the robot falls down, the state is unknown, the robot starts to start body-based gesture detection, firstly, the direction of the robot is obtained according to an Inertial Measurement Unit (IMU), then, the position and the direction of each connecting rod are calculated according to the motor angle values of 20 joints by using the principle of positive kinematics of the robot, and a group of data is obtained and recorded as S1; then starting a motion capture garment on the robot, recording the transmitted data as S2 as the position and the direction of each connecting rod can be transmitted to the industrial personal computer in real time after the motion capture garment is started, subtracting the two groups of data, taking an absolute value, indicating that each connecting rod is normal without phenomena such as breakage, falling and the like if the absolute value is smaller than a set threshold E, and then starting detection based on an external environment by the robot; otherwise, the robot stops working. If the robot can work normally, starting to detect key collision points, taking the left lower corner illustration as an example, at the moment, the readings of piezoelectric film sensors corresponding to the back, the back waist, the right knee, the right leg and the heel in the key collision points of the robot are changed, the key collision points are in contact with the ground, and the values corresponding to the states are set as motor angle values of a plurality of Mx= [20 joints; the position and direction of the links; data collected by an Inertial Measurement Unit (IMU); data collected by the force sensor; the value of the critical collision point ]. 
Since several states that the robot must experience during standing, such as state a, state B, state C … … in the figure, are stored inside the robot, its corresponding value is set to be M A,MB,MC … …; then, the state value Mx at the moment is calculated by using the Marsh distance and is relatively close to the reference state, and the state A is found to be nearest through calculation and comparison, so that the corresponding joint of the robot is controlled to reach the state A. Because the robot has the interference sliding phenomenon on the sole in the process of reaching the state A, although the joint motor reaches a corresponding angle, other parameter values possibly do not reach, so that secondary detection is needed, if the detected state value is correct, the robot reaches the target state, the next target state transition can be continued, as shown in the state B, as shown in the state C, and finally standing; if the detected state value is incorrect, for example, the robot is disturbed in the standing process, the robot changes from an unknown state to a state H, when the state H is reached, the angle values of the 20 joints, the position and the direction of the connecting rod, the data acquired by an Inertial Measurement Unit (IMU), the data acquired by a force sensor and the values of key collision points are set reference values M A, and if the state A is not the expected state A, the optimal state is found according to the state at the moment, the finding result is the state D, the robot joint is controlled to reach the target state D, and then the state C is finally stood.
The examples are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments; any obvious modification, substitution, or variation that can be made by one skilled in the art without departing from the spirit of the present invention falls within the scope of the present invention.
Claims (6)
1. A method for state detection and standing planning of a humanoid robot after a fall, characterized in that:
After the robot falls, the industrial personal computer calculates the position and orientation of each link from the robot's orientation and the angle values of the joint motors, recorded as S1; meanwhile, the motion capture suit acquires the position and orientation of each link, recorded as S2, and transmits them to the industrial personal computer;
if |S1-S2| ≤ E, the robot body structure is normal, the external environment of the robot is detected, and the robot's standing is planned; if |S1-S2| > E, the robot body is abnormal and must be overhauled immediately; wherein E is a set threshold;
planning the robot's standing specifically comprises: calculating the distance between the robot's real-time state value and the reference value of each state that must be passed through during standing, and controlling the corresponding joints of the robot to reach the state with the minimum distance;
after the corresponding joints of the robot reach the state with the minimum distance, detecting whether the angle values of the joint motors, the positions and orientations of the links, the data acquired by the inertial measurement unit, the data acquired by the force sensors, and the values of the key collision points equal the set reference values; if so, continuing with the next state transition; otherwise, repeating the state detection, searching for the optimal state, and controlling the corresponding joints to reach the optimal state.
2. The method for state detection and standing planning of a humanoid robot after a fall according to claim 1, wherein detecting the external environment of the robot specifically comprises: installing piezoelectric film sensors at the key collision points of the robot; if the initial value of a piezoelectric film sensor changes, the body part corresponding to that key collision point is judged to be in contact with the ground.
3. The method for state detection and standing planning of a humanoid robot after a fall according to claim 2, wherein the key collision points include the forehead, back of the head, chest, back, front waist, rear waist, left and right palms, left and right backs of the hands, left and right knees, and front and rear heels of the robot.
4. The method for state detection and standing planning of a humanoid robot after a fall according to claim 1, wherein the real-time state value of the robot and the reference value of each state that must be passed through during standing each include: the angle values of the joint motors, the positions and orientations of the links, the data acquired by the inertial measurement unit, the data acquired by the force sensors, and the values of the key collision points.
5. The method for state detection and standing planning of a humanoid robot after a fall according to claim 1, wherein the distance is calculated by the Mahalanobis distance calculation method.
6. The method for state detection and standing planning of a humanoid robot after a fall according to claim 5, wherein the inertial measurement unit is mounted on the robot's head and the force sensors are mounted at the robot's two ankles.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211034551.0A CN115256468B (en) | 2022-08-26 | 2022-08-26 | A state detection and standing planning method for a humanoid robot after falling down |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211034551.0A CN115256468B (en) | 2022-08-26 | 2022-08-26 | A state detection and standing planning method for a humanoid robot after falling down |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115256468A CN115256468A (en) | 2022-11-01 |
CN115256468B true CN115256468B (en) | 2024-11-22 |
Family
ID=83755753
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211034551.0A Active CN115256468B (en) | 2022-08-26 | 2022-08-26 | A state detection and standing planning method for a humanoid robot after falling down |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115256468B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005186183A (en) * | 2003-12-25 | 2005-07-14 | Nachi Fujikoshi Corp | Industrial robot and its abnormality judging method |
CN111409073A (en) * | 2020-04-02 | 2020-07-14 | 深圳国信泰富科技有限公司 | Tumbling self-recovery method and system for high-intelligence robot |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2964055B1 (en) * | 2010-08-27 | 2012-08-17 | Aldebaran Robotics S A | HUMANOID ROBOT WITH FALL MANAGEMENT CAPABILITIES AND METHOD OF MANAGING FALLS |
CN105252532B (en) * | 2015-11-24 | 2017-07-04 | 山东大学 | The method of the flexible gesture stability of motion capture robot collaboration |
US10335962B1 (en) * | 2017-03-01 | 2019-07-02 | Knowledge Initiatives LLC | Comprehensive fault detection and diagnosis of robots |
CN109866218B (en) * | 2017-12-01 | 2021-04-20 | 优必选教育(深圳)有限公司 | Robot tumble standing control method and device |
CN109605364A (en) * | 2018-10-31 | 2019-04-12 | 北京理工大学 | A fall detection and stability control method for a humanoid robot |
CN111086024A (en) * | 2019-12-18 | 2020-05-01 | 南京熊猫电子股份有限公司 | Monitoring system and monitoring method applied to industrial robot |
CN112405568A (en) * | 2020-10-20 | 2021-02-26 | 同济大学 | Humanoid robot falling prediction method |
CN112859904A (en) * | 2021-01-25 | 2021-05-28 | 乐聚(深圳)机器人技术有限公司 | Method, device and equipment for recovering standing posture of robot and storage medium |
- 2022-08-26 CN CN202211034551.0A patent/CN115256468B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005186183A (en) * | 2003-12-25 | 2005-07-14 | Nachi Fujikoshi Corp | Industrial robot and its abnormality judging method |
CN111409073A (en) * | 2020-04-02 | 2020-07-14 | 深圳国信泰富科技有限公司 | Tumbling self-recovery method and system for high-intelligence robot |
Also Published As
Publication number | Publication date |
---|---|
CN115256468A (en) | 2022-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190354080A1 (en) | Abnormality detector | |
US10220518B2 (en) | Touch-down sensing for robotic devices | |
US11858140B2 (en) | Robot system and supplemental learning method | |
Khalastchi et al. | Online anomaly detection in unmanned vehicles | |
US9555543B2 (en) | Robot with joints of variable rigidity and method for calculating said optimized rigidity | |
CN112549024B (en) | Robot sensorless collision detection method based on time series analysis and application | |
US10893719B2 (en) | Smart shoe module | |
CN110193828B (en) | Method and device for identifying state of mobile robot | |
CN109506641A (en) | The pose loss detection and relocation system and robot of mobile robot | |
CN105459120A (en) | Human-collaborative robot system | |
Matsuno et al. | Adaptive update of reference capacitances in conductive fabric based robotic skin | |
KR20180088241A (en) | Method, Apparatus and System for measuring body balance using Smart Insole | |
US10962957B2 (en) | Collision position estimation device and machine learning device | |
CN115256468B (en) | A state detection and standing planning method for a humanoid robot after falling down | |
US11999051B2 (en) | Control device, control method, and program | |
KR101371655B1 (en) | System for detecting zero velocity interval and method for detecting zero velocity interval | |
Chavez et al. | Contact force and joint torque estimation using skin | |
CN116749196B (en) | Multi-axis mechanical arm collision detection system and method and mechanical arm | |
Subburaman et al. | Multi-sensor based fall prediction method for humanoid robots | |
US11407120B2 (en) | Control device, and control method | |
WO2020107279A1 (en) | Biped robot and moving method therefor, apparatus and storage medium | |
US20230213424A1 (en) | Test system with detection feedback | |
JP2019130325A5 (en) | Shoes, sensor units and programs | |
JP6947908B2 (en) | Methods and corresponding devices that support the movement of at least one user | |
CN117984363A (en) | Anti-collision protection method for small six-joint robot tool |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |