CN115984957A - Recognition method and system for detecting motion and gesture - Google Patents
Recognition method and system for detecting motion and gesture
- Publication number: CN115984957A
- Application number: CN202211622928.4A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The application relates to a recognition method and system for detecting actions and postures, used to predict and judge the fall risk of the elderly. During detection, the step count, maximum step length, pace, knee joint movement angle, and the ratio of support phase duration to swing phase duration within one gait cycle are obtained by image recognition, and the fall probability of the elderly person is then obtained through a fall risk probability formula. By acquiring the prediction parameters through image recognition, the method and system improve the accuracy of those parameters, and by using more parameters to predict the fall risk, they improve the accuracy of fall risk prediction for the elderly.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to an identification method and system for detecting an action and a gesture.
Background
With the rapid development of economy, the world today is facing a serious challenge of aging population. The enormous number of elderly people and the rapid growth rate bring a series of problems to the development of the economic society.
In the prior art, Chinese invention patent CN106539587A discloses a technical scheme that identifies fall risk by detecting actions and postures. It mainly uses three 9-axis motion sensors, two placed in the shoes and one on the waist, obtains raw acceleration, angular velocity and angle data from the three sensors, and detects the fall state by analyzing the various data signals. However, because the motion information acquired by such contact sensors is limited, this scheme does not recognize motions and postures well.
Therefore, there is a need in the art for a recognition system for detecting motions and postures that improves the precision of motion and posture detection and thereby better evaluates fall risk.
Disclosure of Invention
The technical problem to be solved by the application is to overcome the defects of the above technical scheme. A recognition method and system for detecting actions and postures are provided, which obtain multiple action and posture indexes of the person to be detected by image recognition and thereby improve the accuracy of fall risk judgment for the person to be detected.
In order to achieve the above object, according to one aspect of the present application, there is provided a recognition method for performing motion and gesture detection, including:
step 1: the person to be detected enters an action and posture detection area;
step 2: identifying key points of the body of a person to be detected, and tracking the key points;
step 3: collecting the coordinates and times of the key points of the person to be detected while walking in the detection area;
step 4: judging the state of the legs of the person to be detected while walking in the detection area according to the coordinates and times of the key points, and counting the support phase duration and the swing phase duration within one walking period;
step 5: obtaining the advancing direction, step count, maximum step length, pace and knee joint movement angle of the person to be detected while walking in the detection area according to the coordinates and times of the key points;
step 6: inputting the support phase duration within one walking period, the swing phase duration within one walking period, the step count, the maximum step length, the pace and the knee joint movement angle into an elderly fall risk equation to obtain the fall risk of the person to be detected;
step 7: finishing the detection and outputting the judgment result of step 6.
Specifically, the body key points are identified by a camera device, wherein the camera device requires a resolution of at least 720p and is fixedly placed at a height of 0.8 m;
specifically, in step 2, the body key points are a left hip, a right hip, a left knee, a right knee, a left ankle, a right ankle, a left heel, a right heel, a left toe and a right toe;
further, the key points are identified by image recognition; specifically, the image recognition method is one of a machine-learning-based algorithm, a convolutional-neural-network-based algorithm and a regression-based algorithm, and is used to identify the body key points of the person to be detected;
further, tracking the key points specifically comprises obtaining, by image recognition, the coordinates of the key points and the time at which each key point occupies each coordinate while the person to be detected walks in the area to be detected;
specifically, the coordinates are three-dimensional world coordinates;
specifically, the state of the leg of the person to be measured is divided into a support phase or a swing phase;
further, the step of judging the leg state of the person to be detected when the person to be detected walks in the detection area specifically comprises the following steps:
step 4.1: judging that a leg of the person to be detected is in the support phase when the coordinates of its toe key point and heel key point both remain unchanged;
step 4.2: determining that the remaining time within the measured period is the swing phase of the person to be detected;
step 4.3: counting the duration of the support phase and the duration of the swing phase in a walking period;
specifically, the step 5 specifically includes:
step 5.1: obtaining the walking direction of the person to be tested by comparing the knee joint key points with the hip joint key points;
specifically, the step 5.1 is to determine the walking direction of the person to be measured by comparing x-axis coordinates of a knee joint key point and a hip joint key point;
step 5.2: constructing a line connecting the coordinates of the hip joint key point and the knee joint key point at the same moment, and obtaining the number of walking steps of the person to be detected in the detection area from the number of times this line is perpendicular to the ground during the detection process;
step 5.3: calculating the maximum step length of the person to be detected from the leg length and the maximum angle of the hip-knee line within one change period;
specifically, the step 5.3 includes calculating the maximum step length of the person to be detected through a trigonometric function;
furthermore, the maximum angle of the line connecting the hip joint and knee joint key point coordinates at the same moment within one change period is obtained by image recognition, and the leg length is measured by medical staff before the person enters the detection area;
step 5.4: taking the difference between the two moments of maximum angle within the same period obtained in step 5.3 to obtain the time difference, and dividing the step length of each step by this time difference to obtain the pace;
step 5.5: connecting the hip joint key point with the knee joint key point, connecting the knee joint key point with the ankle joint key point, and obtaining the knee joint movement angle from the change of the angle between the two lines within the same period;
specifically, the fall risk formula for the elderly is as follows:
in the formula, a 1 、a 2 、a 3 、a 4 、a 5 、a 6 The fitting coefficient is a value range from-10 to 10;
α 1 is the number of steps in the detection process, alpha 2 Is the maximum step size, alpha, in the detection process 3 Is the step speed, alpha, in the detection process 4 To detect the angle of knee joint movement during the process, alpha 5 For the duration of the supporting phase, alpha, of a walking cycle 6 The duration of the swing phase in a walking period;
specifically, in the application, the setting of the fitting coefficient can be obtained by formula fitting according to a large amount of detection data and the falling risk; judging whether the person to be tested has a risk parameter according to an output result of the old person falling equation;
According to another aspect of the application, a recognition system for performing motion and gesture detection is also provided, which adopts the above recognition method for motion and gesture detection and comprises:
the key point identification module is used for identifying key points of the body of a person to be detected and tracking the key points;
the data collection module is used for collecting the coordinates and time of key points of a person to be detected in the walking process of the detection area;
the leg state counting module is used for judging the state of the leg of the person to be detected when the person to be detected walks in the detection area according to the coordinates and the time of the key points and counting the supporting phase duration and the swinging phase duration in one walking period;
the walking parameter calculation module is used for obtaining the walking advancing direction, the walking number, the maximum step length, the walking speed and the knee joint movement angle of the person to be detected in the detection area according to the coordinates and the time of the key points;
the falling risk judgment module is used for inputting the support phase duration in one walking period, the swing phase duration in one walking period, the step number, the maximum step length, the step speed and the knee joint movement angle into a falling risk equation of the old people to obtain falling risk parameters of the person to be detected;
and the result output module is used for outputting the judgment result.
According to another aspect of the present application, a computer-readable storage medium is also included, on which a data processing program is stored, the data processing program being executed by a processor to perform a recognition method for motion and gesture detection as described above.
Based on the technical scheme, the identification method for detecting the action and the gesture has the following technical effects:
1. In the detection process, the step count, maximum step length, pace, knee joint movement angle, support phase duration within one walking period and swing phase duration within one walking period are obtained by image recognition. Compared with the prior art, which arranges sensors on the body of the person to be detected and along the walking channel, this improves the accuracy of parameter acquisition, reduces the number of sensors, and saves cost.
2. Compared with the prior art, which judges fall risk from only a few indexes, the method and system predict fall risk from more parameters with wider coverage, spanning multiple dimensions such as pace, joint angle and walking phase durations. The prediction is therefore more scientific, and the accuracy of fall risk prediction for the elderly is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the disclosure. In the drawings:
FIG. 1 is a flow chart of a recognition method for performing motion and gesture detection provided by an embodiment of the present application;
fig. 2 is a flowchart for determining a leg state of the person to be detected when walking in the detection area according to the embodiment of the present application;
fig. 3 is a flowchart for obtaining the walking direction, the walking number, the maximum step length, the walking speed, and the knee joint movement angle of the person to be detected in the detection area according to the coordinates and the time of the key points.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the drawings of the embodiments of the present invention, and it is obvious that the described embodiments are some but not all of the embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments herein without making any creative effort, shall fall within the scope of protection. It should be noted that the embodiments and features of the embodiments may be arbitrarily combined with each other without conflict.
Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise", "comprising", and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, what is meant is "including, but not limited to".
Example one
As shown in fig. 1, a recognition method for performing motion and gesture detection includes:
step 1: the person to be detected enters an action and posture detection area;
step 2: identifying key points of the body of a person to be detected, and tracking the key points;
specifically, the body key points are identified by a camera device, wherein the camera device requires a resolution of at least 720p and is fixedly placed at a height of 0.8 m;
specifically, the body key points are a left hip, a right hip, a left knee, a right knee, a left ankle, a right ankle, a left heel, a right heel, a left toe, and a right toe;
further, the key points are identified by image recognition; specifically, the image recognition method is one of a machine-learning-based algorithm, a convolutional-neural-network-based algorithm and a regression-based algorithm, and is used to identify the body key points of the person to be detected;
further, tracking the key points specifically comprises obtaining, by image recognition, the coordinates of the key points and the time at which each key point occupies each coordinate while the person to be detected walks in the area to be detected;
step 3: collecting the coordinates and times of the key points of the person to be detected while walking in the detection area;
specifically, the coordinates are three-dimensional world coordinates;
step 4: judging the state of the legs of the person to be detected while walking in the detection area according to the coordinates and times of the key points, and counting the support phase duration and the swing phase duration within one walking period;
specifically, the state of the leg of the person to be measured is divided into a support phase or a swing phase;
further, the step of judging the leg state of the person to be detected when walking in the detection area specifically comprises:
step 4.1: judging that a leg of the person to be detected is in the support phase when the coordinates of its toe key point and heel key point both remain unchanged;
step 4.2: determining that the remaining time within the measured period is the swing phase of the person to be detected;
as is known, when a person walks each leg has only two states, a support state and a swing state, and a leg that is not in the support state is in the swing state; therefore, when a leg of the person to be detected is not in the support phase, it is necessarily in the swing phase;
step 4.3: counting the duration of the support phase and the duration of the swing phase in a walking period;
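The support/swing decision of steps 4.1 to 4.3 can be sketched in code. This is a minimal illustration only, not part of the patent: the per-frame coordinate representation, the stillness tolerance and the function names are assumptions.

```python
def classify_leg_states(toe, heel, tol=1e-3):
    """Label each inter-frame interval as 'support' or 'swing'.

    toe, heel: per-frame (x, y, z) coordinates of one foot's toe and
    heel key points. An interval is support phase when neither key
    point moved between frames (step 4.1); all remaining time is the
    swing phase (step 4.2).
    """
    def still(a, b):
        return all(abs(p - q) <= tol for p, q in zip(a, b))

    return [
        "support" if still(toe[i - 1], toe[i]) and still(heel[i - 1], heel[i])
        else "swing"
        for i in range(1, len(toe))
    ]


def phase_durations(states, dt):
    """Step 4.3: total support and swing durations, given frame interval dt."""
    return states.count("support") * dt, states.count("swing") * dt
```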
step 5: obtaining the walking direction, step count, maximum step length, pace and knee joint movement angle of the person to be detected while walking in the detection area according to the coordinates and times of the key points;
specifically, the step 5 specifically includes:
step 5.1: obtaining the walking direction of the person to be tested by comparing the knee joint key points with the hip joint key points;
specifically, the step 5.1 is to determine the walking direction of the person to be tested by comparing x-axis coordinates of the key points of the knee joint and the hip joint;
exemplarily, at time t1 the knee joint key point has coordinates (20, y1, z1) and the hip joint key point has coordinates (16, y2, z2); the knee is therefore ahead of the hip along the x axis, and the walking direction of the person to be detected is the positive x-axis direction;
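The x-axis comparison of step 5.1 amounts to a one-line test. In this sketch the function name and the sign convention (knee ahead of hip means walking toward +x) are assumptions, not part of the patent:

```python
def walking_direction(knee_x, hip_x):
    """Step 5.1 sketch: if the knee key point is ahead of the hip key
    point along the x axis, the person walks toward +x, else toward -x."""
    return "+x" if knee_x > hip_x else "-x"
```

With the example coordinates above, `walking_direction(20, 16)` returns `"+x"`.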
step 5.2: constructing a line connecting the coordinates of the hip joint key point and the knee joint key point at the same moment, and obtaining the number of walking steps of the person to be detected in the detection area from the number of times this line is perpendicular to the ground during the detection process;
specifically, in the prior art, sensors are generally arranged on the ground of the detection area to count the steps of the person to be detected, which requires laying a large number of sensors in the detection area at considerable cost. In this embodiment, the step count is instead determined by counting, through image recognition, the number of times the hip-knee line is perpendicular to the ground. This saves the cost of laying sensors on the ground, avoids the loss of precision caused by sensor detection accuracy, and improves the precision of the step count;
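Counting verticality events of the hip-knee line (step 5.2) can be sketched as follows. Treating z as the vertical axis and approximating "perpendicular to the ground" by a small horizontal offset between the two key points are assumptions of this illustration:

```python
def count_steps(hip_frames, knee_frames, tol=1e-2):
    """Step 5.2 sketch: count rising edges of 'hip-knee line is
    vertical', where verticality is approximated by the horizontal
    (x, y) offset between the two key points dropping below tol."""
    steps, was_vertical = 0, False
    for (hx, hy, _hz), (kx, ky, _kz) in zip(hip_frames, knee_frames):
        vertical = abs(hx - kx) <= tol and abs(hy - ky) <= tol
        if vertical and not was_vertical:  # count each vertical event once
            steps += 1
        was_vertical = vertical
    return steps
```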
step 5.3: calculating the maximum step length of the person to be detected from the leg length and the maximum angle of the hip-knee line within one change period;
specifically, the step 5.3 includes calculating the maximum step length of the person to be detected through a trigonometric function;
furthermore, the maximum angle of the line connecting the hip joint and knee joint key point coordinates at the same moment within one change period is obtained by image recognition, and the leg length is measured by medical staff before the person enters the detection area. The maximum step length can thus be obtained easily and at low cost, and attaching too many sensors to the person to be detected, which would cause psychological pressure and therefore inaccurate detection, is avoided;
step 5.4: taking the difference between the two moments of maximum angle within the same period obtained in step 5.3 to obtain the time difference, and dividing the step length of each step by this time difference to obtain the pace;
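Steps 5.3 and 5.4 can be sketched with elementary trigonometry. The geometric model below, in which the stride spans twice the leg's horizontal reach at the maximum angle from vertical, is one plausible reading of the text and not the patent's stated formula:

```python
import math

def max_step_length(leg_len, max_angle_deg):
    """Step 5.3 sketch: assuming max_angle_deg is the hip-knee line's
    largest angle from the vertical and both legs tilt symmetrically,
    the stride spans 2 * leg_len * sin(angle)."""
    return 2.0 * leg_len * math.sin(math.radians(max_angle_deg))


def pace(step_len, t_first_max, t_second_max):
    """Step 5.4: step length divided by the time between the two
    moments of maximum angle within the same period."""
    return step_len / abs(t_second_max - t_first_max)
```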
step 5.5: connecting the hip joint key point with the knee joint key point, connecting the knee joint key point with the ankle joint key point, and obtaining the knee joint movement angle from the change of the angle between the two lines within the same period;
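The angle between the hip-knee and knee-ankle segments (step 5.5) follows from the dot product. A minimal sketch, with the assumption that the movement angle over a period is the largest knee angle minus the smallest:

```python
import math

def knee_angle(hip, knee, ankle):
    """Angle in degrees at the knee between the hip-knee and
    knee-ankle segments."""
    v1 = [h - k for h, k in zip(hip, knee)]
    v2 = [a - k for a, k in zip(ankle, knee)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against rounding just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))


def knee_movement_angle(frames):
    """Movement angle over one period: largest minus smallest knee angle.
    frames: iterable of (hip, knee, ankle) coordinate triples."""
    angles = [knee_angle(h, k, a) for h, k, a in frames]
    return max(angles) - min(angles)
```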
therefore, in the step 5, the advancing direction, the step number, the maximum step length, the step speed and the knee joint movement angle are obtained through the method, and the parameters are used for detecting the action and the posture of the person to be detected, so that whether the person to be detected has a falling risk is judged, the detected indexes are more comprehensive compared with the prior art, and the judgment precision of the falling risk is improved;
step 6: inputting the support phase duration in one walking period, the swing phase duration in one walking period, the step number, the maximum step length, the step speed and the knee joint movement angle into a falling risk equation of the old to obtain falling risk parameters of the person to be detected;
specifically, the fall risk formula for the elderly is as follows:
in the formula, a 1 、a 2 、a 3 、a 4 、a 5 、a 6 The fitting coefficient is a value range from-10 to 10;
α 1 is the number of steps in the detection process, alpha 2 Is the maximum step size, alpha, in the detection process 3 Is the pace of the detection process, α 4 To detect the angle of knee joint movement during the process, alpha 5 For the duration of the supporting phase, alpha, of a walking cycle 6 Is a rowThe duration of the swing phase within the travel period;
specifically, in this embodiment, the setting of the fitting coefficient may be obtained by formula fitting according to a large amount of detection data and the fall risk; judging whether the person to be detected has the risk of falling according to the output result of the falling equation of the old;
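The fall risk equation itself appears only as an image in the source and is not reproduced in this text, so the sketch below assumes the simplest form consistent with the description: a linear combination of the six gait parameters α1 through α6 with fitting coefficients a1 through a6 in [-10, 10]. Both the function and the linear form are hypothetical.

```python
def fall_risk_score(params, coeffs):
    """Hypothetical linear risk score. params: the six gait parameters
    (step count, max step length, pace, knee movement angle, support
    duration, swing duration); coeffs: fitting coefficients a1..a6."""
    if not all(-10 <= a <= 10 for a in coeffs):
        raise ValueError("fitting coefficients must lie in [-10, 10]")
    return sum(a * x for a, x in zip(coeffs, params))
```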
step 7: finishing the detection and outputting the judgment result of step 6.
Example two
The application further includes a recognition system for performing motion and gesture detection, where the recognition system employs the motion and gesture detection recognition method in the first embodiment, and the recognition system includes:
the key point identification module is used for identifying key points of the body of a person to be detected and tracking the key points;
the data collection module is used for collecting the coordinates and time of key points of a person to be detected in the walking process of the detection area;
the leg state counting module is used for judging the state of the leg of the person to be detected when the person to be detected walks in the detection area according to the coordinates and the time of the key points and counting the supporting phase duration and the swinging phase duration in one walking period;
the walking parameter calculation module is used for obtaining the walking advancing direction, the step number, the maximum step length, the step speed and the knee joint movement angle of the person to be detected in the detection area according to the coordinates and the time of the key points;
the falling risk judgment module is used for inputting the support phase duration in one walking period, the swing phase duration in one walking period, the step number, the maximum step length, the step speed and the knee joint movement angle into a falling risk equation of the old people to obtain falling risk parameters of the person to be detected;
and the result output module is used for outputting the judgment result.
EXAMPLE III
The present embodiment includes a computer-readable storage medium having a data processing program stored thereon, the data processing program being executed by a processor to perform the recognition method for motion and gesture detection according to the first embodiment.
As will be appreciated by one of skill in the art, the embodiments herein may be provided as a method, apparatus (device), or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied in the medium. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, including, but not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer, and the like. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices) and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments herein have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following appended claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of this disclosure.
It will be apparent to those skilled in the art that various changes and modifications may be made herein without departing from the spirit and scope thereof. Thus, it is intended that such changes and modifications be included herein, provided they come within the scope of the appended claims and their equivalents.
Claims (10)
1. A recognition method for performing motion and gesture detection, characterized by: the method comprises the following steps:
step 1: the person to be detected enters an action and posture detection area;
step 2: identifying key points of the body of a person to be detected, and tracking the key points;
step 3: collecting the coordinates and times of the key points of the person to be detected while walking in the detection area;
step 4: judging the state of the legs of the person to be detected while walking in the detection area according to the coordinates and times of the key points, and counting the support phase duration and the swing phase duration within one walking period;
step 5: obtaining the walking direction, step count, maximum step length, pace and knee joint movement angle of the person to be detected while walking in the detection area according to the coordinates and times of the key points;
step 6: inputting the support phase duration within one walking period, the swing phase duration within one walking period, the step count, the maximum step length, the pace and the knee joint movement angle into an elderly fall risk formula to obtain the fall risk parameter of the person to be detected;
step 7: finishing the detection and outputting the judgment result of step 6.
2. A recognition method for performing motion and gesture detection according to claim 1, wherein in step 2, the body key points are left hip, right hip, left knee, right knee, left ankle, right ankle, left heel, right heel, left toe and right toe.
3. An identification method for motion and gesture detection according to claim 1, characterized in that it is used for identification of key points by means of image identification, in particular, the image identification method is one of machine learning algorithm, convolutional neural network algorithm, regression algorithm for identification of body key points of the person under test.
4. An identification method for performing action and posture detection according to claim 1, wherein judging the state of the legs of the person to be detected while walking in the detection area specifically comprises:
step 4.1: when the coordinates of the toe key point and the heel key point both remain unchanged, judge that the person to be detected is in the support phase;
step 4.2: determine the remaining time within the measured period as the swing phase of the person to be detected;
step 4.3: count the support-phase duration and the swing-phase duration within one walking cycle.
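The phase segmentation in claim 4 can be sketched as a small routine over time-stamped foot key points. This is a minimal illustration, not the patent's implementation: a frame interval counts as support phase when both the toe and heel coordinates stay (near-)unchanged between consecutive frames (step 4.1), and everything else counts as swing phase (step 4.2); the function names and data layout are assumptions.

```python
def segment_phases(frames, eps=1e-3):
    """Label intervals of one foot as support or swing.

    frames: time-ordered list of (t, toe_xy, heel_xy) tuples.
    An interval is 'support' when both toe and heel coordinates are
    unchanged (within eps) between consecutive frames (step 4.1);
    otherwise it is 'swing' (step 4.2). Returns the accumulated
    (support_duration, swing_duration) over the sequence (step 4.3).
    """
    def still(a, b):
        return abs(a[0] - b[0]) <= eps and abs(a[1] - b[1]) <= eps

    support = swing = 0.0
    for (t0, toe0, heel0), (t1, toe1, heel1) in zip(frames, frames[1:]):
        dt = t1 - t0
        if still(toe0, toe1) and still(heel0, heel1):
            support += dt
        else:
            swing += dt
    return support, swing


# Synthetic sequence: foot planted for 0.6 s, then swinging for 0.4 s.
frames = [
    (0.0, (0.0, 0.0), (-0.2, 0.0)),
    (0.3, (0.0, 0.0), (-0.2, 0.0)),   # unchanged -> support
    (0.6, (0.0, 0.0), (-0.2, 0.0)),   # unchanged -> support
    (0.8, (0.3, 0.1), (0.1, 0.1)),    # moving    -> swing
    (1.0, (0.6, 0.0), (0.4, 0.0)),    # moving    -> swing
]
support, swing = segment_phases(frames)
```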
5. A recognition method for performing motion and gesture detection according to claim 1, wherein said step 5 specifically comprises:
step 5.1: obtain the walking direction of the person to be detected by comparing the knee-joint key points with the hip-joint key points;
step 5.2: construct the line connecting the hip-joint key point and knee-joint key point coordinates at the same moment, and obtain the number of steps the person to be detected walks in the detection area from the number of times this line is perpendicular to the ground during detection;
step 5.3: calculate the maximum step length of the person to be detected from the maximum angle of the line connecting the hip-joint and knee-joint key point coordinates within the change cycle, together with the leg length;
step 5.4: take the difference between the two moments within the same cycle at which the maximum angle of step 5.3 occurs to obtain the time difference, then divide the step length of each step by the time difference to obtain the pace;
step 5.5: connect the hip-joint key point with the knee-joint key point, connect the knee-joint key point with the ankle-joint key point, and obtain the knee-joint movement angle from the change of the angle between the two lines within the same cycle.
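Steps 5.1, 5.2 and 5.5 reduce to simple plane geometry on the key-point coordinates. A minimal sketch under illustrative conventions (names, coordinate frame and the forward direction are assumptions, not from the patent): direction from the knee-vs-hip x offset, step counting from the hip-knee line crossing the vertical, and knee angle from the hip-knee and knee-ankle vectors.

```python
import math

def walking_direction(hip, knee):
    """Step 5.1 (per claim 6): compare x coordinates of knee and hip."""
    return "forward" if knee[0] > hip[0] else "backward"

def count_steps(hip_knee_angles, eps=1e-6):
    """Step 5.2: count how often the hip-knee line passes through vertical.

    hip_knee_angles: signed angle (radians) of the hip-knee line from
    the vertical, sampled per frame. A sign change means the line was
    perpendicular to the ground once, i.e. one step.
    """
    steps = 0
    for a0, a1 in zip(hip_knee_angles, hip_knee_angles[1:]):
        if a0 * a1 < -eps:  # crossed zero (vertical)
            steps += 1
    return steps

def knee_angle(hip, knee, ankle):
    """Step 5.5: angle (degrees) between hip-knee and knee-ankle segments."""
    v1 = (hip[0] - knee[0], hip[1] - knee[1])
    v2 = (ankle[0] - knee[0], ankle[1] - knee[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

direction = walking_direction(hip=(0.0, 1.0), knee=(0.2, 0.9))
steps = count_steps([-0.3, -0.1, 0.2, 0.4, -0.2])       # two zero crossings
angle = knee_angle(hip=(0, 2), knee=(0, 1), ankle=(0, 0))  # straight leg
```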
6. An identification method for performing motion and posture detection according to claim 5, characterized in that step 5.1 determines the walking direction of the person to be detected by comparing the x-axis coordinates of the knee-joint and hip-joint key points.
7. The identification method for performing motion and posture detection according to claim 5, wherein step 5.3 specifically comprises obtaining the maximum step length of the person to be detected by trigonometric calculation; the maximum angle of the line connecting the hip-joint and knee-joint key point coordinates at the same moment within the change cycle is obtained by image recognition, and the leg length is measured by medical staff before the person enters the detection area.
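Claim 7 does not spell out the trigonometric calculation. One plausible reading, assuming the maximum angle is measured between the two thigh lines at full stride and the legs are modelled as rigid segments of length L, is that the two legs and the ground form an isosceles triangle, giving a step length of 2·L·sin(θ/2). This is a sketch under that assumption, not the patent's stated formula.

```python
import math

def max_step_length(max_angle_deg, leg_length):
    """Hypothetical reading of claim 7: the two legs of length
    `leg_length` and the ground form an isosceles triangle whose
    apex angle is `max_angle_deg`, so the base (the step length)
    is 2 * L * sin(theta / 2)."""
    theta = math.radians(max_angle_deg)
    return 2.0 * leg_length * math.sin(theta / 2.0)

# 60 degrees between the thighs with 0.9 m legs gives a 0.9 m step,
# since sin(30 deg) = 0.5.
step = max_step_length(60.0, 0.9)
```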
8. An identification method for action and gesture detection according to claim 5, characterized in that the elderly fall risk formula is:
where a1, a2, a3, a4, a5 and a6 are fitting coefficients with values ranging from -10 to 10; α1 is the number of steps during detection, α2 is the maximum step length during detection, α3 is the pace during detection, α4 is the knee-joint movement angle during detection, α5 is the support-phase duration within one walking cycle, and α6 is the swing-phase duration within one walking cycle.
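The published text does not reproduce the formula itself (it appears as an image in the original filing). Given that a1-a6 are described as fitting coefficients paired with α1-α6, a weighted linear combination is the most natural reading; the sketch below is strictly an assumption, including every coefficient value shown.

```python
def fall_risk(alphas, coeffs):
    """Hypothetical linear reading of the elderly fall-risk formula:
    risk = sum(a_i * alpha_i). The actual formula is an image in the
    original filing and is not reproduced in the text; per claim 8,
    the fitted coefficients a1..a6 lie in [-10, 10]."""
    assert len(alphas) == len(coeffs) == 6
    return sum(a * x for a, x in zip(coeffs, alphas))

# Illustrative values only: steps, max step length (m), pace (m/s),
# knee angle (deg), support-phase (s), swing-phase (s).
risk = fall_risk(
    alphas=[20, 0.9, 1.2, 175.0, 0.6, 0.4],
    coeffs=[0.1, -2.0, -1.5, 0.01, 3.0, -3.0],
)
```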
9. A recognition system for performing motion and gesture detection, the recognition system employing the recognition method for motion and gesture detection of any one of claims 1-8 and comprising:
the key point identification module, used to identify the body key points of the person to be detected and to track the key points;
the data collection module, used to collect the coordinates and timestamps of the key points while the person to be detected walks through the detection area;
the leg state counting module, used to judge the state of the legs of the person to be detected while walking in the detection area from the key-point coordinates and timestamps, and to count the support-phase duration and the swing-phase duration within one walking cycle;
the walking parameter calculation module, used to obtain the walking direction, number of steps, maximum step length, pace and knee-joint movement angle of the person to be detected in the detection area from the key-point coordinates and timestamps;
the fall risk judgment module, used to input the support-phase duration within one walking cycle, the swing-phase duration within one walking cycle, the number of steps, the maximum step length, the pace and the knee-joint movement angle into the elderly fall-risk formula to obtain the fall-risk parameter of the person to be detected;
and the result output module, used to output the judgment result.
10. A computer readable storage medium having stored thereon a data processing program which, when executed by a processor, implements the recognition method for motion and gesture detection according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211622928.4A CN115984957A (en) | 2022-12-16 | 2022-12-16 | Recognition method and system for detecting motion and gesture |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115984957A true CN115984957A (en) | 2023-04-18 |
Family
ID=85975156
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211622928.4A Pending CN115984957A (en) | 2022-12-16 | 2022-12-16 | Recognition method and system for detecting motion and gesture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115984957A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118512171A (en) * | 2024-05-21 | 2024-08-20 | 华南理工大学 | Senile weakness risk assessment method and system based on gait analysis |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Dikovski et al. | Evaluation of different feature sets for gait recognition using skeletal data from Kinect | |
US9183431B2 (en) | Apparatus and method for providing activity recognition based application service | |
CN101398890B (en) | Person determination device | |
CN107578019B (en) | A gait recognition system and recognition method based on visual and tactile fusion | |
CN109145696B (en) | Old people falling detection method and system based on deep learning | |
CN111383221B (en) | A method and computer equipment for generating a scoliosis detection model | |
CN102499687B (en) | Pig respirator rate detecting method and device on basis of machine vision | |
CN112668549B (en) | Pedestrian attitude analysis method, system, terminal and storage medium | |
CN115569344B (en) | Standing long jump performance evaluation method, device, electronic equipment and storage medium | |
CN111597975B (en) | Personnel action detection method and device and electronic equipment | |
CN111288986A (en) | Motion recognition method and motion recognition device | |
CN110728754A (en) | Rigid body mark point identification method, device, equipment and storage medium | |
CN115984957A (en) | Recognition method and system for detecting motion and gesture | |
WO2017158569A1 (en) | Kinect based balance analysis using single leg stance (sls) exercise | |
CN105491307A (en) | Depth sensor system | |
CN113303789B (en) | Gait event detection method and device based on acceleration | |
CN112438722B (en) | Sarcopenia assessment device and storage medium | |
CN103854026B (en) | A kind of recognition methods and electronic equipment | |
CN108903947A (en) | Gait analysis method, gait analysis device and readable storage medium storing program for executing | |
CN117109567A (en) | Riding gesture monitoring method and system for dynamic bicycle movement and wearable riding gesture monitoring equipment | |
CN112741617A (en) | CSI-based omnidirectional gait detection algorithm | |
CN114358214B (en) | Gait adaptive recognition method and device, storage medium and terminal | |
CN112734886B (en) | Biological model penetration detection method, device, electronic device and storage medium | |
WO2023108498A1 (en) | Zero-speed interval detection method, pedestrian navigation system and storage medium | |
CN115204221A (en) | Method and device for detecting physiological parameters and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||