CN118452897A - Gait detection, motion recognition and fatigue detection methods, devices and storage medium - Google Patents
- Publication number
- CN118452897A (application number CN202410547431.3A)
- Authority
- CN
- China
- Prior art keywords
- gait
- emg
- local extremum
- event
- motion data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B5/112—Gait analysis
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/389—Electromyography [EMG]
- A61B5/6802—Sensor mounted on worn items
- A61B5/7203—Signal processing specially adapted for physiological signals, for noise prevention, reduction or removal
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
- G06F18/10—Pre-processing; Data cleansing
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2431—Multiple classes
- A61B2503/10—Athletes
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
Abstract
The invention provides a gait detection, motion recognition and fatigue detection method, a device and a storage medium. Gait motion data are acquired; gait events are identified from the acquired data; the identified gait events are segmented into gait cycles; and gait detection information is extracted from each segmented gait cycle. The method only requires gait motion data collected by sensors worn on the legs, which are then analyzed and processed to obtain the gait detection result. The gait detection apparatus provided by the embodiment is a portable, wireless and compact sensor system that measures lower-leg motion and surface EMG simultaneously; it is easy to implement and operate, and is suitable for miniaturized, everyday gait detection.
Description
Technical Field
The invention relates to the technical field of sports science, and in particular to a gait detection, motion recognition and fatigue detection method, device and storage medium.
Background
In sports science, gait analysis typically uses sensors to collect real-time data during walking or running and analyzes these data to extract gait features. The extracted features can then be used to identify and quantify gait asymmetry, or to evaluate the stability and efficiency of walking.
Existing gait analysis requires specialized equipment and environments: a standard motion-data acquisition system is needed, and the subject must move along a designated route to obtain accurate motion data. Such systems are therefore unsuitable for small-scale operation or everyday measurement and cannot meet users' daily needs.
Disclosure of Invention
In view of the shortcomings of the prior art, the invention aims to provide a gait detection, motion recognition and fatigue detection method, device and storage medium that overcome the limitation that prior-art gait detection depends on special equipment and environments and therefore cannot support everyday gait detection.
The technical scheme of the invention is as follows:
in a first aspect, the present embodiment provides a gait detection method, including:
Acquiring gait motion data;
identifying a gait event from the acquired gait motion data;
performing gait cycle segmentation on the identified gait event;
gait detection information is extracted from each of the divided gait cycles.
Optionally, the step of identifying a gait event from the acquired gait motion data comprises:
Local extrema and changes in the direction of motion of the angular velocity are identified from the gait motion data, and the gait event is identified according to the time points and order in which the local extrema occur and the time points at which the direction of motion of the angular velocity changes.
Optionally, the gait event comprises a plurality of gait phases, namely: a heel strike phase, a sole-flat phase, a push-off phase and a swing phase;
The step of identifying the gait event according to the time points and order of occurrence of the local extrema and the time points at which the direction of motion of the angular velocity changes comprises:
determining the swing phase in a gait event according to the time point at which the first local extremum of the angular velocity occurs;
determining the heel strike phase in a gait event according to the time point corresponding to the change in the direction of motion after the first local extremum of the angular velocity;
determining the sole-flat phase in a gait event according to the time point at which the second local extremum of the angular velocity occurs after the direction of motion changes;
and determining the push-off phase in the gait event according to the time point at which the third local extremum of the angular velocity occurs after the second local extremum.
Optionally, the local extremum is a maximum or minimum point whose absolute amplitude exceeds a fixed, adaptive or normalized threshold;
the second and third local extrema are the maximum or minimum points occurring within an adaptation period, or the points occurring at the end of that period;
the adaptation period of the second local extremum begins at the heel strike phase and lasts for a duration proportional to the swing-phase time of the previous step;
the adaptation period of the third local extremum begins at the sole-flat phase and lasts until the direction of motion of the angular velocity changes.
Optionally, the step of identifying a gait event from the acquired gait motion data specifically includes: processing the gait motion data with a finite state machine to determine the gait phase of each frame in the motion corresponding to the gait motion data.
Optionally, the gait detection information includes: space-time parameters and kinematic parameters;
the step of extracting gait detection information from each of the divided gait cycles includes:
Calculating space-time parameters according to corresponding time points, stride time and stride length between gait phases in the gait cycle;
and calculating the kinematic parameters from the accelerometer signals and gyroscope signals contained in the gait motion data.
Optionally, the step of calculating the space-time parameter according to the corresponding time point, the stride time and the stride length between the gait phases in the gait cycle includes:
Calculating temporal gait parameters from the corresponding time points between gait phases in a gait event;
calculating spatial gait parameters from the acceleration signals measured between the sole-flat gait events of two consecutive gait cycles.
Optionally, the step of calculating the kinematic parameter according to the accelerometer signal and the gyroscope signal included in the gait motion data includes:
the inclination angles of the lower-limb body segments are calculated by processing the accelerometer and gyroscope signals with a complementary filter or a Kalman filter, and the kinematic parameters are calculated from the differences between the inclination angles of adjacent lower-limb segments.
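As a hedged illustration of the complementary-filter variant (the blend factor `alpha`, the axis convention and the function name are assumptions; the patent specifies neither the filter constant nor the exact fusion formula), the sagittal-plane tilt of one segment might be estimated as follows, using the sensor orientation described later in the embodiment (Y along the segment, Z forward, rotation about X):

```python
import math

def complementary_tilt(acc_y, acc_z, gyro_x, dt, alpha=0.98):
    """Blend integrated gyro rate (trusted at high frequency) with the
    accelerometer gravity direction (trusted at low frequency) to estimate
    the sagittal-plane tilt of one body segment. alpha is an assumed blend
    factor, not a value from the patent."""
    theta, angles = 0.0, []
    for ay, az, gx in zip(acc_y, acc_z, gyro_x):
        acc_angle = math.atan2(az, ay)              # tilt implied by gravity
        theta = alpha * (theta + gx * dt) + (1.0 - alpha) * acc_angle
        angles.append(theta)
    return angles
```

A joint angle such as the knee angle can then be taken as the difference between the tilt estimates of adjacent segments (e.g. thigh minus shank), matching the "difference of inclination angles" step above.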
In a second aspect, the present embodiment provides a gait recognition and fatigue detection method based on gait detection information, including:
respectively obtaining gait detection information of the two legs by using the gait detection method;
calculating to obtain a symmetry index according to gait detection information corresponding to the two legs, and determining a gait recognition result according to the symmetry index;
Respectively acquiring EMG data corresponding to each divided gait cycle by using an EMG sensor;
And determining a fatigue detection result of the muscle according to the acquired EMG data and/or a power spectrum corresponding to the EMG data.
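The patent does not state which symmetry-index formula is used. One common choice, given here purely as an illustrative assumption, is a Robinson-style index that compares the same gait parameter measured on the two legs:

```python
def symmetry_index(left, right):
    """Robinson-style symmetry index in percent: 0 for perfect symmetry.
    left/right are the same gait parameter (e.g. stride time in seconds)
    measured on each leg. Assumed formula, not taken from the patent."""
    return 200.0 * abs(left - right) / (left + right)
```

For example, `symmetry_index(1.02, 0.98)` evaluates to about 4, i.e. roughly a 4 % asymmetry in stride time between the legs.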
Optionally, the step of determining the fatigue detection result of the muscle according to the acquired EMG data and/or the power spectrum corresponding to the EMG data includes:
determining an RMS value from the EMG data, wherein the RMS value is the square root of the arithmetic mean of the squares of the EMG samples in a moving window;
converting the EMG data from the time domain into the frequency domain to obtain the power spectrum of the EMG;
calculating an MNF value and an MDF value from the power spectrum of the EMG, wherein the MNF value is the mean frequency, obtained by dividing the sum of the products of the EMG power spectrum and frequency by the sum of the power spectrum, and the MDF value is the frequency that divides the EMG power spectrum into two regions of equal power, each containing half of the total power;
And determining a fatigue detection result of the muscle according to one or more of the RMS value, the MNF value and the MDF value.
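The three fatigue indicators can be sketched as below. This assumes the power spectrum has already been obtained (e.g. from an FFT of the windowed EMG); the function names are illustrative:

```python
import math

def emg_rms(window):
    """Square root of the arithmetic mean of the squared EMG samples."""
    return math.sqrt(sum(v * v for v in window) / len(window))

def emg_mnf_mdf(power, freqs):
    """MNF: power-weighted mean frequency, sum(P*f)/sum(P).
    MDF: lowest frequency at which cumulative power reaches half the total."""
    total = sum(power)
    mnf = sum(p * f for p, f in zip(power, freqs)) / total
    cumulative = 0.0
    for p, f in zip(power, freqs):
        cumulative += p
        if cumulative >= total / 2.0:
            return mnf, f
```

During fatiguing contractions the EMG amplitude (RMS) typically rises while MNF and MDF shift downward, which is the usual basis for a fatigue decision from these indicators.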
In a third aspect, the present embodiment further provides a gait detection apparatus, including: the system comprises an inertial sensor, an electromyography sensor, a processor and a memory, wherein the processor is in communication connection with both the inertial sensor and the electromyography sensor;
The inertial sensor and the electromyography sensor are fixed on the surface of the skin and are used for acquiring gait motion data and sending the gait motion data to the processor;
the processor is configured to implement the steps of the gait detection method when executing the gait detection program stored in the memory.
In a fourth aspect, the present embodiment further discloses a computer storage medium, where the computer readable storage medium stores one or more programs, and the one or more programs are executable by one or more processors to implement the steps of the gait detection method.
Beneficial effects: the invention provides a gait detection, motion recognition and fatigue detection method, device and storage medium. Gait motion data are acquired; gait events are identified from the acquired data; the identified gait events are segmented into gait cycles; and gait detection information is extracted from each segmented gait cycle. The method only requires gait motion data collected by sensors worn on the legs, which are analyzed and processed to obtain the gait detection result; it is therefore easy to implement and operate, and suitable for miniaturized, everyday gait detection.
Drawings
FIG. 1 is a flow chart of steps of a gait detection method according to the invention;
FIG. 2 is a graph of raw sagittal plane gyroscope signals collected from the lower leg and foot, respectively, during running in an embodiment of the present invention;
FIG. 3 is a graph of filtered sagittal plane gyroscope signals collected from the lower leg and foot, respectively, in a running test in an embodiment of the present invention;
FIG. 4 is a graph of filtered sagittal plane gyroscope signals collected from the lower leg and foot, respectively, during a slow walk test in an embodiment of the present invention;
FIG. 5 is a graph of filtered sagittal plane gyroscope signals collected from the left and right lower legs, respectively, during running in an embodiment of the present invention;
FIG. 6 is a graph of filtered sagittal plane gyroscope signals collected from left and right lower legs, respectively, in a slow walking test of an embodiment of the present invention;
FIG. 7 is a graph of left and right knee joint angles calculated from the difference in inclination angle between the lower leg and the body part of the foot during running in an embodiment of the present invention;
FIG. 8 is a graph of left and right knee joint angles calculated from the difference in inclination angle between the lower leg and the body part of the foot in a slow walking test in accordance with an embodiment of the present invention;
FIG. 9 is a schematic representation of an exemplary EMG of lower limb muscles in a slow walking test in accordance with the present invention;
FIG. 10 is a flow chart of steps of a method for motion recognition and fatigue detection according to an embodiment of the present invention;
fig. 11 is a schematic block diagram of a device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clear and clear, the present invention will be further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. When an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present; "connected" or "coupled" as used herein may include wirelessly connected or coupled. The term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs unless defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In healthcare and rehabilitation, gait detection can be used to identify gait abnormalities and check for deviations from normal gait patterns, and gait analysis of both legs can identify and quantify gait asymmetry and evaluate the dynamic stability and efficiency of walking. In sports science, valuable gait-pattern data can be provided to running coaches, sports scientists and researchers by analyzing factors such as stride length, stance time, stride frequency, joint angles and surface electromyography (EMG), in order to evaluate and monitor the gait performance of patients and athletes and to identify potential training improvements.
In prior-art gait detection, a person's motion is typically captured with a standard motion-capture system, and the person must move on a designated path or treadmill: for example, reflective markers are placed on the body, multiple EMG electrodes are attached to the leg muscles, and infrared cameras are installed along the designated path or treadmill. Because of these strict requirements on acquisition equipment and scenes, current methods are unsuitable for small-scale operation and daily measurement, so ordinary exercisers cannot accurately detect gait from data acquired during free movement.
In addition, conventional gait segmentation typically uses foot switches to identify stance and swing phases from foot-loading patterns. However, because multiple pressure sensors must be placed at specified positions and must not shift during intense exercise, foot switches are difficult to use in practice and rarely yield good gait segmentation. They must also be adjusted and calibrated for users with different foot sizes, which further increases placement difficulty and reduces acquisition accuracy.
In order to overcome the problems and realize accurate detection of gait, the embodiment provides a gait detection method based on a wearable sensor, a motion recognition and fatigue detection method and a system based on gait detection information, which acquire gait data by using the wearable sensor and analyze the acquired gait motion data to realize accurate detection of gait.
The method and system of the present embodiment are explained in more detail below.
As shown in fig. 1, the present embodiment discloses a gait detection method, which includes:
step S1, gait motion data are acquired.
In this embodiment, wearable sensors are used to collect gait motion data during long-distance walking or running. In particular, a wearable sensor may be attached to the shoelace of an exerciser to monitor walking, jogging or cycling, and its inertial sensor can measure the motion data and direction of motion of the exerciser's foot.
In one embodiment, the athletic personnel may wear the wearable sensors on the thigh, calf and foot and make an orientation adjustment. Specifically, the orientation adjustment of the wearable sensor includes: ensuring that the three axes of the sensor are aligned with the leg segments, for example: the three axes of the inertial sensor are aligned with the leg segments, the X-axis is parallel to the axis of rotation of the body joint, i.e. the lateral/medial direction of the lower limb, and perpendicular to the YZ plane, which is parallel to the plane of motion; the Y-axis is aligned with the length of the body segment so that when the user stands upright, the Y-axis is parallel to the gravity line; the Z axis is perpendicular to the X and Y axes and is directed forward when the athlete is running or walking. During movement, as the user's body part moves, the accelerometer in the inertial sensor will measure both the acceleration of the body part due to gravity and the movement data of the body, while the gyroscope in the inertial sensor will measure the angular velocity of the proximal body joint of the body part. The gyroscope may be attached to the leg section or the foot section.
In addition, the wearable sensor may be fixed to the skin surface of the thigh and calf with a plastic housing and straps. An opening in the housing allows the two electrodes at the bottom of the sensor to contact the skin and measure the potential difference across the skin surface between them; this potential difference corresponds to the activity of the underlying leg muscle, i.e. the EMG. The housing may also carry a loop through which a strap can be passed and fastened to the body part with a hook-and-loop fastener. The sensor may be fixed to the foot by tying the loop of the housing to the user's shoe with the laces.
When the wearable sensor is secured to the lower leg or foot of an athlete, the IMU in the wearable sensor measures the linear acceleration and angular velocity of the lower leg and foot in the sagittal plane, i.e., rotation about the Z-axis, during running or walking of the athlete. Typical raw gyroscope signal outputs for the lower leg and foot are shown in fig. 2.
Step S2, identifying gait events from the acquired gait motion data.
Since the gait of an exercising person is a periodic, repetitive motion, each cycle may be called a gait cycle, and each gait cycle comprises a series of gait phases, for example a stance phase and a swing phase. A gait cycle can be further divided into more specific functional phases: initial contact, loading response, mid-stance, terminal stance, pre-swing, initial swing, mid-swing and terminal swing.
After the gait motion data are acquired in step S1, the data are first processed and filtered to remove their high-frequency components, and gait events are then identified.
In practice, the acquired gait motion data must first be processed: because the raw gyroscope output contains high-frequency noise, it can be removed with a low-pass filter. Butterworth filtering can be implemented with the difference equation of an IIR filter.
To support real-time operation, the gyroscope output is scanned in the forward time direction with a moving window, and all calculations are done frame by frame.
The second-order Butterworth low-pass filter can be implemented using the following difference equation:
y(n)=b0·x(n)+b1·x(n-1)+b2·x(n-2)+a1·y(n-1)+a2·y(n-2);
wherein λ = 1/tan(π·fc/fs), with fc the cut-off frequency and fs the sampling frequency; b0 = 1/(1 + √2·λ + λ²); b1 = 2·b0; b2 = b0; a1 = 2·b0·(λ² − 1); a2 = −b0·(1 − √2·λ + λ²);
and x(n) is the gyroscope signal input at frame n, while y(n) is the filtered gyroscope signal output.
In one implementation, a second-order Butterworth filter with a cut-off frequency of 3 Hz is used; typical filtered gyroscope outputs for the lower leg and foot during running are shown in FIG. 3.
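The frame-by-frame filtering described above can be sketched as follows. This is an illustrative implementation of the stated difference equation, not the patent's code; the function names are assumptions.

```python
import math

def butter2_lowpass_coeffs(fc, fs):
    """Second-order Butterworth low-pass coefficients via the bilinear
    transform; lam is the prewarped cut-off, matching the relations
    b1 = 2*b0, b2 = b0, a1 = 2*b0*(lam^2 - 1) given in the text."""
    lam = 1.0 / math.tan(math.pi * fc / fs)
    b0 = 1.0 / (1.0 + math.sqrt(2.0) * lam + lam * lam)
    a1 = 2.0 * b0 * (lam * lam - 1.0)
    a2 = -b0 * (1.0 - math.sqrt(2.0) * lam + lam * lam)
    return b0, 2.0 * b0, b0, a1, a2

def butter2_filter(x, fc, fs):
    """Apply y(n) = b0*x(n) + b1*x(n-1) + b2*x(n-2) + a1*y(n-1) + a2*y(n-2)
    frame by frame, as required for moving-window processing."""
    b0, b1, b2, a1, a2 = butter2_lowpass_coeffs(fc, fs)
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b0 * xn + b1 * x1 + b2 * x2 + a1 * y1 + a2 * y2
        y.append(yn)
        x1, x2 = xn, x1
        y1, y2 = yn, y1
    return y
```

At DC the gain is unity, since (b0 + b1 + b2)/(1 − a1 − a2) = 1, so a constant input passes through unchanged once transients decay, while components above the cut-off are attenuated.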
Specifically, step S2 comprises: identifying local extrema and changes in the direction of motion of the angular velocity from the gait motion data, and identifying the gait event according to the time points and order in which the local extrema occur and the time points at which the direction of motion of the angular velocity changes.
The local extrema can be detected with a first-derivative test: a maximum is identified where the first derivative changes from positive to negative, and a minimum where it changes from negative to positive. A maximum or minimum point must have an absolute amplitude above a constant, adaptive or normalized threshold; the threshold may be determined by averaging the gyroscope signal over a time window during the sole-flat phase.
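A minimal sketch of this first-derivative test, assuming a uniformly sampled signal and a precomputed threshold (function and variable names are illustrative):

```python
def find_local_extrema(signal, threshold):
    """Return (index, value, kind) for each local extremum whose absolute
    amplitude exceeds the threshold. A maximum is where the first
    difference changes from positive to negative; a minimum is where it
    changes from negative to positive."""
    extrema = []
    for i in range(1, len(signal) - 1):
        d_prev = signal[i] - signal[i - 1]
        d_next = signal[i + 1] - signal[i]
        if d_prev > 0 > d_next and abs(signal[i]) > threshold:
            extrema.append((i, signal[i], "max"))
        elif d_prev < 0 < d_next and abs(signal[i]) > threshold:
            extrema.append((i, signal[i], "min"))
    return extrema
```

On a real gyroscope trace this would run over the low-pass-filtered output, with the threshold adapted per the sole-flat averaging described above.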
In particular, the gait event may be divided into four successive gait phases: a heel strike phase, a sole-flat phase, a push-off phase and a swing phase.
During the swing phase, the foot or leg swings forward, which the inertial sensor detects as a backward rotation of the foot or leg in the sagittal plane; this is represented by the first local extremum of the angular velocity exceeding a predefined threshold in the negative region and is defined as the swing phase (S).
After the swing phase, at initial contact the heel strikes the ground and then rotates forward until the sole lies flat; the inertial sensor detects this as a forward rotation of the foot or leg in the sagittal plane, represented by a change in the polarity of the gyroscope signal, and it is defined as the heel strike phase (H).
After the heel strike phase, a second local extremum occurs in the positive region exceeding a predefined threshold, defined as the foot flat phase (abbreviated F).
In the final support phase, the foot is lifted by the leg and rotates about the metatarsal heads as an axis, rocking over the forefoot before swinging forward, which is commonly known as push-off. This action can be detected by the inertial sensor as a further forward rotation of the foot or leg in the sagittal plane, which exhibits a third local extremum of the angular velocity, defined as the push-off phase (abbreviated P). The push-off phase should occur within a period that starts at the heel strike time point plus a certain proportion of the previous swing time, and lasts until a subsequent change in the direction of movement of the angular velocity.
The step of identifying the gait event according to the time point and the sequence of occurrence of the local extremum and the time point of change of the movement direction of the angular velocity in the step comprises the following steps:
and S21, determining a swing stage in the gait event according to the time point of the first local extremum of the angular velocity.
Step S22, determining the heel strike stage in the gait event according to the time point corresponding to the movement direction change after the first local extremum of the angular velocity.
Step S23, determining the sole flat-laying stage in the gait event according to the time point of the second local extremum of the angular velocity after the movement direction is changed.
And step S24, determining a foot pushing stage in the gait event according to the time point of occurrence of the third local extremum of the angular velocity after the second local extremum.
The second and third local extrema are represented as the maximum or minimum points occurring within an adaptation period, or as points occurring at the end of the adaptation period. The adaptation period of the second local extremum begins at the moment of the heel strike phase and lasts for a duration proportional to the swing time of the previous step; the adaptation period of the third local extremum begins at the moment of the foot flat phase and lasts until the direction of movement of the angular velocity changes.
The step of identifying the gait event from the acquired gait motion data specifically includes: using a finite state machine to process the gait motion data and determine the gait phase of each frame in the motion process corresponding to the gait motion data. This achieves minimal delay and operates in a real-time, online manner; the gait phases described above can be marked as a moving window scans the gyroscope signal.
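A hypothetical sketch of such an H/F/P/S finite state machine on a sagittal-plane angular-velocity stream; the fixed thresholds and the simplified transition conditions are illustrative assumptions (a real implementation would use the adaptive periods described above):

```python
def gait_fsm(omega, pos_thr=1.5, neg_thr=-1.5):
    """Label each angular-velocity sample with a gait phase.

    Illustrative state machine: swing (S) -> heel strike (H) on gyro
    polarity change, H -> foot flat (F) on a large positive extremum,
    F -> push-off (P) when rotation reverses, P -> S on a large
    negative extremum. Thresholds are assumptions, not the patent's.
    """
    phases, state = [], "S"
    for i, w in enumerate(omega):
        prev = omega[i - 1] if i > 0 else w
        if state == "S" and prev < 0 <= w:
            state = "H"            # polarity change: heel strike
        elif state == "H" and w > pos_thr:
            state = "F"            # second (positive) extremum: foot flat
        elif state == "F" and w < 0:
            state = "P"            # rotation reverses: push-off begins
        elif state == "P" and w < neg_thr:
            state = "S"            # large negative extremum: swing
        phases.append(state)
    return phases

# One synthetic gait cycle of lower-leg angular velocity
phases = gait_fsm([-2.0, -0.5, 0.5, 2.0, 1.0, -0.5, -2.0, -1.0])
```

Because each sample is labelled as it arrives, the per-frame latency is a single sample, matching the real-time, online behaviour described above.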
Fig. 4 shows a similar detection of gait events during slow walking. As can be seen from fig. 4, the support phase and the swing phase are clearly highlighted and segmented accordingly. It can also be seen that the placement of the inertial sensors on the lower leg or foot has no significant effect on the detection of gait events.
Step S3, performing gait cycle segmentation on the identified gait event.
After a gait event is identified, a gait cycle segmentation may be performed based on the identified gait event. Since the above four gait phases are repeated every gait cycle, the gait cycle can be divided according to the time at which the four gait phases occur.
In the segmentation of the gait cycle, the second and third local extrema may be represented as the above-mentioned maximum or minimum points occurring within an adaptive period, or as points occurring at the end of the adaptive period. The adaptive period of the second local extremum starts at the moment of the heel strike phase and lasts for a duration proportional to the swing time of the previous step. The adaptive period of the third local extremum starts at the moment of the foot flat phase and lasts until the direction of movement of the angular velocity changes. The order of the gait phase detection steps is not critical, and where appropriate they can be performed simultaneously.
Step S4, extracting gait detection information from each divided gait cycle.
Further, the gait detection information includes: space-time parameters and kinematic parameters;
the step of extracting gait detection information from each of the divided gait cycles includes:
Calculating space-time parameters according to corresponding time points, stride time and stride length between gait phases in the gait cycle; specifically, the step of calculating the space-time parameter according to the corresponding time point, the stride time and the stride length between the gait phases in the gait cycle includes:
The temporal gait parameters are calculated from the corresponding time points between gait phases in the gait event; the spatial gait parameters are calculated from the acceleration signals measured between foot flat gait events of two consecutive steps.
Because gait events for the left and right legs (running tasks as shown in fig. 5 and jogging tasks as shown in fig. 6) can be identified simultaneously, the detected gait events can be used as reference time points to calculate the spatiotemporal gait parameters.
The following are examples of some temporal gait parameters:
stance time = t_TO(k) - t_HS(k);
swing time = t_HS(k+1) - t_TO(k);
stride time = t_HS(k+1) - t_HS(k);
where t_TO(k) and t_HS(k) are the time points of toe-off and heel strike, respectively, at the k-th step.
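The temporal parameters above can be sketched directly from the detected event time stamps (names are illustrative):

```python
def temporal_params(t_hs, t_to):
    """Per-step stance, swing and stride times (seconds) from lists of
    heel-strike (t_hs) and toe-off (t_to) time stamps, where t_to[k]
    occurs after t_hs[k] within the same step."""
    stance = [t_to[k] - t_hs[k] for k in range(len(t_to))]
    swing = [t_hs[k + 1] - t_to[k] for k in range(len(t_hs) - 1)]
    stride = [t_hs[k + 1] - t_hs[k] for k in range(len(t_hs) - 1)]
    return stance, swing, stride

# Example: three heel strikes and three toe-offs at a steady cadence
stance, swing, stride = temporal_params([0.0, 1.1, 2.2], [0.7, 1.8, 2.9])
```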
If the accelerometer of the inertial sensor in the wearable sensor is initially aligned with the global reference frame and the gravity line, then for rotation in the Y-Z plane about the X-axis with angular velocity ω, any subsequent deviation from the global frame by an angle θ can be represented by a rotation matrix R used to project the accelerometer data into the global frame. To calculate the spatial gait parameters, the gravitational acceleration should first be removed from the filtered accelerometer signal: the linear acceleration of the sensor in the global frame, a^g, is obtained by projecting the accelerometer measurements in the body frame, a^b, and subtracting 1 g. Next, by double integrating the linear acceleration a^g, the velocity v^g of the wearable sensor and the travelled distance d^g can be calculated.
Wherein superscripts g and b represent reference frames relative to a global reference frame and a body reference frame, respectively; the subscript 0 denotes an initial value.
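The double integration described above can be sketched as follows; a plain rectangle rule is used for brevity, whereas a real system would re-zero the integration drift at each foot flat event (a zero-velocity update, which is an assumption here, not a step stated in the text):

```python
import numpy as np

def velocity_distance(a_g, dt, v0=0.0):
    """Double-integrate gravity-free linear acceleration a_g (global
    frame, m/s^2) sampled at interval dt into velocity and travelled
    distance, starting from initial velocity v0."""
    v = v0 + np.cumsum(a_g) * dt        # first integral: velocity
    d = np.cumsum(v) * dt               # second integral: distance
    return v, d

# Example: 1 m/s^2 constant acceleration for 1 s at 10 Hz
v, d = velocity_distance(np.ones(10), dt=0.1)
```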
The following are some other examples of spatiotemporal gait parameters:
The tilt angle of a body part may be calculated using a complementary filter that combines tilt angle measurements from the accelerometer and gyroscope. The complementary filter compensates for the low-frequency drift of the gyroscope and the high-frequency noise of the accelerometer, providing a more reliable tilt angle output. Kinematic gait parameters such as joint angles can be obtained by subtracting the inclination angles of adjacent body parts.
θ(n) = α[θ(n-1) + ω(n)Δt] + (1 - α)θ_a(n);
wherein θ_a(n) and θ_g(n) = θ(n-1) + ω(n)Δt are the tilt angles measured by the accelerometer and the gyroscope, respectively; Δt is the time interval between the current (n) and the last (n-1) measurement; α is the control weight parameter of the complementary filter, ranging over [0, 1]; the recommended value of α in this embodiment is 0.9.
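One update step of the complementary filter just described can be sketched as:

```python
def complementary_filter(theta_prev, omega, theta_acc, dt, alpha=0.9):
    """One complementary-filter update: the gyro-integrated angle forms
    the high-pass path, the accelerometer tilt angle the low-pass path,
    blended with weight alpha (0.9 recommended in the embodiment)."""
    theta_gyro = theta_prev + omega * dt        # integrate angular velocity
    return alpha * theta_gyro + (1.0 - alpha) * theta_acc

# Example: previous angle 0 rad, gyro 1 rad/s, accelerometer reads 10 rad
theta = complementary_filter(0.0, omega=1.0, theta_acc=10.0, dt=0.1)
```

The gyro path dominates short-term changes while the accelerometer path slowly corrects drift, which is why the filter tolerates both sensor defects.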
The following are examples of some of the kinematic gait parameters:
A typical knee angle calculated using this method during running is shown in fig. 7. Fig. 8 shows the same for jogging, with the angles of both the left and right legs. For a particular application, the joint angles may be measured uniaxially in the sagittal plane or multiaxially in a three-dimensional configuration.
And calculating according to the accelerometer signals and the gyroscope signals contained in the gait motion data to obtain the kinematic parameters. Specifically, the step of calculating the kinematic parameters according to the accelerometer signals and the gyroscope signals contained in the gait motion data includes:
the inclination angle of the lower limb body part is calculated by processing the accelerometer signal and the gyroscope signal using a complementary filter or a kalman filter, and the kinematic parameter is calculated according to the difference of the calculated inclination angles between the adjacent lower limb body parts.
The method disclosed by the embodiment can be suitable for running test at any pace, and the sensor used in the embodiment is convenient to carry and realize. Moreover, the detection process of gait events can be integrated with a machine learning algorithm to detect time instances and automatically adjust the adaptive or normalized threshold.
On the basis of the gait detection method, the embodiment also discloses a gait recognition and fatigue detection method based on gait detection information, as shown in fig. 10, wherein the method comprises the following steps:
And step H1, respectively obtaining gait detection information of the two legs by using the gait detection method.
And step H2, calculating to obtain a symmetry index according to gait detection information corresponding to the two legs, and determining a gait recognition result according to the symmetry index.
After the gait detection information of the two legs is obtained according to the gait detection method, the gait stability and efficiency during running or walking can be evaluated according to the gait detection information. In particular, limb symmetry between the left and right legs (or healthy and affected side) is an indicator of physiological health and effective movement. Limb symmetry between the left and right legs is generally characterized using a symmetry index.
The Symmetry Index (SI) is expressed as a ratio between the contralateral limbs, or as a percentage value:
SI = (X_L - X_R) / [0.5 (X_L + X_R)] × 100%;
wherein, X L and X R are gait parameters measured by the left and right limbs, respectively.
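The symmetry index can be sketched as follows, assuming the common mean-normalized form of the index (the patent's exact formula is given as an image, so this form is an assumption):

```python
def symmetry_index(x_left, x_right):
    """Symmetry Index as a percentage of the mean of both limbs;
    0 % indicates perfect left/right symmetry."""
    return 100.0 * (x_left - x_right) / (0.5 * (x_left + x_right))

# Example: left stride length 1.1 m vs right 0.9 m -> 20 % asymmetry
si = symmetry_index(1.1, 0.9)
```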
And step H3, respectively acquiring the EMG data corresponding to each divided gait cycle by using an EMG sensor.
An important gait feature involved in gait analysis is the monitoring of muscle fatigue during long-distance walking or running. Researchers can embed EMG electrodes and amplifiers directly above the leg muscles and collect leg-muscle data through them, in order to study leg muscle activity in each gait phase and monitor the long-term effects of muscle fatigue over a period of time or a number of steps.
And step H4, determining a fatigue detection result of the muscle according to the acquired EMG data and/or a power spectrum corresponding to the EMG data.
The raw EMG data of four typical lower limb muscles (quadriceps femoris, popliteal, tibialis anterior and calf muscles), measured by the worn sensors while the user is walking, are shown in fig. 9.
To monitor muscle strength and fatigue during running or walking, several useful features may be extracted from raw EMG data over a limited period of time or as a function of time. These features include RMS values from the time domain, and MNF and MDF from the frequency domain.
In the time domain, the amplitude of the EMG signal can be used to estimate the level of muscle contraction. Common EMG envelope processing includes signal rectification and low-pass filtering. The RMS is the square root of the arithmetic mean of the squares of a set of EMG readings within a moving window: the higher the RMS value, the greater the level of muscle contraction.
Further, the step of determining the fatigue detection result of the muscle according to the collected EMG data and/or the power spectrum corresponding to the EMG data includes:
determining and obtaining an RMS value according to EMG data; wherein the RMS value is the square root of the arithmetic mean of the squares of a set of EMG data in a moving window;
converting the EMG data in the time domain into a frequency domain to obtain a power spectrum of the EMG;
Calculating an MNF value and an MDF value from the power spectrum of the EMG; wherein the MNF value is the mean frequency, calculated by dividing the sum of the products of the EMG power spectrum and frequency by the sum of the power spectrum; the MDF value is the frequency that divides the EMG power spectrum into two regions of equal power, each amounting to half of the total power; and determining a fatigue detection result of the muscle according to one or more of the RMS value, the MNF value and the MDF value.
When monitoring lower-limb muscle activity by sEMG with a linear envelope, rectification and low-pass filtering of the electromyographic signals are typically involved; envelope estimation may be achieved by calculating the RMS value of the raw electromyographic signal over a moving window. Muscle fatigue can be defined as any exercise-induced decrease in a muscle's ability to develop maximum strength. An increase in muscle strength is typically positively correlated with the amplitude and frequency of the electromyographic signal; under muscle fatigue, however, the power spectrum of the electromyographic signal shifts toward a lower frequency range. Thus, muscle fatigue can be detected when the MNF or MDF of the electromyographic power spectrum decreases while the RMS amplitude of the electromyographic signal remains unchanged.
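The decision rule just described (MNF drops while RMS stays level) can be sketched as follows; the tolerance and drop thresholds are illustrative assumptions, not values from the patent:

```python
def fatigue_detected(rms_first, rms_last, mnf_first, mnf_last,
                     rms_tol=0.1, mnf_drop=0.1):
    """Flag muscle fatigue when the mean frequency falls by more than
    `mnf_drop` (fractional) while the RMS amplitude stays within
    `rms_tol` of its initial value. Thresholds are assumptions."""
    rms_stable = abs(rms_last - rms_first) <= rms_tol * rms_first
    spectral_shift = (mnf_first - mnf_last) >= mnf_drop * mnf_first
    return rms_stable and spectral_shift

# Example: MNF 80 -> 60 Hz with near-constant RMS suggests fatigue
flagged = fatigue_detected(1.0, 1.02, 80.0, 60.0)
```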
Specifically, the RMS value is the square root of the arithmetic mean of the squares of a set of EMG data in a moving window, with the corresponding calculation formula:
RMS = sqrt( (1/N) Σ_{i=1}^{N} x_i² );
where x_i is the i-th EMG sample and N is the number of samples in the window.
In the frequency domain, the power spectrum of the EMG can be used to assess the degree of muscle fatigue. To convert an EMG signal from the time domain to the frequency domain, the Fourier transform of the autocorrelation function of the EMG signal may be used to provide the power spectral density (PSD). The MNF is then the mean frequency, calculated as the sum of the products of the EMG power spectrum and frequency divided by the sum of the power spectrum. The MDF is the frequency that divides the EMG power spectrum into two regions of equal power, each at half the total power. The corresponding calculation formulas of MNF and MDF are, respectively:
MNF = Σ_{j=1}^{M} f_j P_j / Σ_{j=1}^{M} P_j;
Σ_{j=1}^{MDF} P_j = Σ_{j=MDF}^{M} P_j = (1/2) Σ_{j=1}^{M} P_j;
where f_j and P_j are the frequency and power of the j-th spectral bin, and M is the number of bins.
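The RMS, MNF and MDF features can be sketched together as follows; an FFT-based one-sided power spectrum stands in here for the autocorrelation-based PSD mentioned in the text:

```python
import numpy as np

def emg_features(x, fs):
    """RMS (time domain) plus MNF and MDF (frequency domain) of one EMG
    analysis window x sampled at fs Hz."""
    rms = np.sqrt(np.mean(x ** 2))
    p = np.abs(np.fft.rfft(x)) ** 2                 # one-sided power spectrum
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mnf = np.sum(f * p) / np.sum(p)                 # power-weighted mean frequency
    half = np.searchsorted(np.cumsum(p), np.sum(p) / 2.0)
    mdf = f[half]                                   # frequency splitting power in half
    return rms, mnf, mdf

# A pure 50 Hz tone: RMS = 1/sqrt(2), and MNF = MDF = 50 Hz
t = np.arange(1000) / 1000.0
rms, mnf, mdf = emg_features(np.sin(2 * np.pi * 50.0 * t), fs=1000.0)
```

A downward shift of MNF/MDF over successive windows, with RMS roughly constant, is the fatigue signature described above.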
MNF and MDF can be used to assess muscle fatigue, as muscle fatigue can cause the frequency spectrum of the EMG signal to shift downward. In such embodiments, the sensor system may utilize the detected gait events for gait segmentation and then may perform muscle fatigue detection per gait phase for more accurate analysis.
In the gait detection method disclosed in this embodiment, a detection apparatus using a wearable sensor that integrates an inertial sensor (IMU) and a surface electromyography sensor (sEMG) collects the gait motion data of the exerciser during walking or running with the inertial sensor and the muscle state data with the electromyography sensor; the collected gait motion data and muscle state data are then analyzed and processed to recognize the motion posture and evaluate the degree of muscle fatigue. The method provided by this embodiment can identify not only the two typical gait phases (i.e., the stance and swing phases) but also the HFPS gait events, namely Heel Strike, Foot Flat, Push-off and Swing, which helps to analyze gait. The segmented gait events can be applied to the sEMG signals for per-step muscle activity monitoring, or used as a controller for a walking operation device.
In a third aspect, the present embodiment further provides a gait detection apparatus, as shown in fig. 11, including: an inertial sensor 101, an electromyogram sensor 102, a processor 103 communicatively coupled to both the inertial sensor 101 and the electromyogram sensor 102, and a memory 104.
The gait detection apparatus provided in this embodiment is portable, wireless (connecting to a personal smartphone or tablet via Bluetooth) and compact; a sensor system capable of measuring leg movements and surface EMG simultaneously is an ideal choice for gait analysis in daily activities and outdoor environments. Inertial sensors (IMUs) are lightweight, miniature, portable, low-cost motion tracking devices. A robust, real-time gait event detection method using an IMU can identify gait phases for step-by-step gait analysis. A sensor system with wireless data communication, data memory and sufficient battery capacity can integrate multiple motion sensors to track the orientation and relative position of body parts, and can meet the requirements of gait analysis in daily activities.
The inertial sensor 101 and the electromyography sensor 102 are fixed to the skin surface for acquiring gait motion data and transmitting the gait motion data to the processor 103.
The processor 103 is configured to implement the steps of the gait detection method when executing the gait detection program stored in the memory 104.
The steps of the gait detection method realized by the processor when executing the gait detection program comprise:
Acquiring gait motion data;
identifying a gait event from the acquired gait motion data;
performing gait cycle segmentation on the identified gait event;
gait detection information is extracted from each of the divided gait cycles.
And identifying the local extremum of the angular velocity and the movement direction change from the gait movement data, and identifying the gait event according to the time point and the sequence of occurrence of the local extremum and the movement direction change time point of the angular velocity.
Determining a swing stage in a gait event according to a time point when the first local extremum occurs in the angular velocity;
determining a heel strike stage in a gait event according to a time point corresponding to the change of the movement direction after the first local extremum of the angular velocity;
Determining the sole flat-laying stage in a gait event according to the time point when the second local extremum of the angular velocity appears after the movement direction changes;
and determining the foot pushing stage in the gait event according to the time point when the third local extremum of the angular velocity appears after the second local extremum.
Further, the local extrema are represented as maximum or minimum points having an absolute amplitude above a fixed, adaptive or normalized threshold.
The second and third local extrema are represented as maximum or minimum points occurring within an adaptation period, or as points occurring at the end of the adaptation period.
The adaptation period of the second local extremum begins at the moment of the heel strike phase and lasts for a duration proportional to the swing time of the previous step.
The adaptation period of the third local extremum begins at the moment of the foot flat phase and lasts until the direction of movement of the angular velocity changes.
Further, the step of identifying a gait event from the acquired gait motion data specifically includes: and calculating and processing the gait motion data by using a finite state machine, and determining the gait stage of each frame in the motion process corresponding to the gait motion data.
Further, the gait detection information includes: space-time parameters and kinematic parameters;
the step of extracting gait detection information from each of the divided gait cycles includes:
Calculating space-time parameters according to corresponding time points, stride time and stride length between gait phases in the gait cycle;
and calculating according to the accelerometer signals and the gyroscope signals contained in the gait motion data to obtain the kinematic parameters.
Further, the step of calculating the space-time parameter according to the corresponding time point, the stride time and the stride length between the gait phases in the gait cycle includes:
Calculating to obtain a time gait parameter according to corresponding time points among gait phases in a gait event;
The spatial gait parameters are calculated from the acceleration signals measured between foot flat gait events of two consecutive steps.
Further, the step of calculating the kinematic parameters according to the accelerometer signals and the gyroscope signals included in the gait motion data includes:
the inclination angle of the lower limb body part is calculated by processing the accelerometer signal and the gyroscope signal using a complementary filter or a kalman filter, and the kinematic parameter is calculated according to the difference of the calculated inclination angles between the adjacent lower limb body parts.
Further, the present embodiment also discloses a computer storage medium, where the computer readable storage medium stores one or more programs, and the one or more programs are executable by one or more processors to implement the steps of the gait detection method.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order from that shown or discussed, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or N wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured via optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, the steps may be implemented using any one or a combination of the following techniques known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
Those of ordinary skill in the art will appreciate that all or part of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, and the program may be stored in a computer readable storage medium, where the program when executed includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented as software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like. While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.
Claims (12)
1. A gait detection method, comprising:
Acquiring gait motion data;
identifying a gait event from the acquired gait motion data;
performing gait cycle segmentation on the identified gait event;
gait detection information is extracted from each of the divided gait cycles.
2. The gait detection method according to claim 1, wherein the step of identifying a gait event from the acquired gait motion data comprises:
The local extremum and the movement direction change of the angular velocity are identified from the gait movement data, and the gait event is identified according to the time point and the sequence of occurrence of the local extremum and the movement direction change time point of the angular velocity.
3. The gait detection method of claim 2, wherein the gait event comprises a plurality of gait phases; the gait phase is: a heel strike stage, a sole flat stage, a foot pushing stage and a swing stage;
The step of identifying the gait event according to the time point of occurrence of the local extremum, the sequence of occurrence and the time point of change of the movement direction of the angular velocity comprises the following steps:
Determining a swing stage in a gait event according to a time point when the first local extremum occurs in the angular velocity;
determining a heel strike stage in a gait event according to a time point corresponding to the change of the movement direction after the first local extremum of the angular velocity;
Determining the sole flat-laying stage in a gait event according to the time point when the second local extremum of the angular velocity appears after the movement direction changes;
and determining the foot pushing stage in the gait event according to the time point when the third local extremum of the angular velocity appears after the second local extremum.
4. The gait detection method according to claim 3, wherein the local extrema are represented as maximum or minimum points having an absolute amplitude above a fixed, adaptive or normalized threshold;
the second and third local extrema are represented as maximum or minimum points occurring within an adaptation period, or as points occurring at the end of the adaptation period;
the adaptation period of the second local extremum begins at the moment of the heel strike stage and lasts for a duration proportional to the swing time of the previous step;
the adaptation period of the third local extremum begins at the moment of the sole flat-laying stage and lasts until the direction of movement of the angular velocity changes.
5. The gait detection method according to claim 2, wherein the step of identifying a gait event from the acquired gait motion data specifically comprises: and calculating and processing the gait motion data by using a finite state machine, and determining the gait stage of each frame in the motion process corresponding to the gait motion data.
6. The gait detection method according to claim 1, wherein the gait detection information includes: space-time parameters and kinematic parameters;
the step of extracting gait detection information from each of the divided gait cycles includes:
Calculating space-time parameters according to corresponding time points, stride time and stride length between gait phases in the gait cycle;
and calculating according to the accelerometer signals and the gyroscope signals contained in the gait motion data to obtain the kinematic parameters.
7. The gait detection method according to claim 6, wherein the step of calculating the spatiotemporal parameter from the corresponding time points between the gait phases in the gait cycle, the stride time, and the stride length comprises:
Calculating to obtain a time gait parameter according to corresponding time points among gait phases in a gait event;
the spatial gait parameters are calculated from acceleration signals measured between sole flat-laying gait events of two consecutive steps.
8. The gait detection method according to claim 6, wherein the step of calculating the kinematic parameters from the accelerometer signal and the gyroscope signal included in the gait motion data comprises:
the inclination angle of the lower limb body part is calculated by processing the accelerometer signal and the gyroscope signal using a complementary filter or a kalman filter, and the kinematic parameter is calculated according to the difference of the calculated inclination angles between the adjacent lower limb body parts.
9. A gait recognition and fatigue detection method based on gait detection information, comprising:
obtaining gait detection information of the two legs respectively by using the gait detection method as claimed in any one of claims 1 to 8;
calculating to obtain a symmetry index according to gait detection information corresponding to the two legs, and determining a gait recognition result according to the symmetry index;
Respectively acquiring EMG data corresponding to each divided gait cycle by using an EMG sensor;
And determining a fatigue detection result of the muscle according to the acquired EMG data and/or a power spectrum corresponding to the EMG data.
10. The gait recognition and fatigue detection method based on gait detection information according to claim 9, wherein the step of determining the fatigue detection result of the muscle from the acquired EMG data and/or the power spectrum corresponding to the EMG data comprises:
determining and obtaining an RMS value according to EMG data; wherein the RMS value is the square root of the arithmetic mean of the squares of a set of EMG data in a moving window;
converting the EMG data in the time domain into a frequency domain to obtain a power spectrum of the EMG;
calculating an MNF value and an MDF value from the power spectrum of the EMG; wherein the MNF value is the mean frequency, calculated by dividing the sum of the products of the EMG power spectrum and frequency by the sum of the power spectrum; the MDF value is the frequency that divides the EMG power spectrum into two regions of equal power, each amounting to half of the total power;
And determining a fatigue detection result of the muscle according to one or more of the RMS value, the MNF value and the MDF value.
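The three indicators of claim 10 can be computed directly from the definitions given there. The FFT-based spectrum estimate and window length below are implementation choices, not requirements of the claim:

```python
import numpy as np

def emg_rms(emg, window):
    """Moving-window RMS: square root of the arithmetic mean of the
    squared samples in each window (per the claim's definition)."""
    sq = np.asarray(emg, dtype=float) ** 2
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(sq, kernel, mode="valid"))

def emg_mnf_mdf(emg, fs):
    """Mean frequency (MNF) and median frequency (MDF) of the EMG power
    spectrum; fs is the sampling rate in Hz."""
    emg = np.asarray(emg, dtype=float)
    power = np.abs(np.fft.rfft(emg)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    total = power.sum()
    # MNF: sum(f * P(f)) / sum(P(f)).
    mnf = (freqs * power).sum() / total
    # MDF: frequency splitting the spectrum into two equal-power halves.
    mdf = freqs[np.searchsorted(np.cumsum(power), total / 2.0)]
    return mnf, mdf
```

During fatiguing contractions, RMS amplitude typically rises while MNF and MDF shift toward lower frequencies, which is why the claim allows any of the three (or a combination) to drive the fatigue decision.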
11. A gait detection apparatus, comprising: the system comprises an inertial sensor, an electromyography sensor, a processor and a memory, wherein the processor is in communication connection with both the inertial sensor and the electromyography sensor;
The inertial sensor and the electromyography sensor are fixed on the surface of the skin and are used for acquiring gait motion data and sending the gait motion data to the processor;
the processor is configured to implement the steps of the gait detection method according to any one of claims 1-8 when executing a gait detection program stored in the memory.
12. A computer storage medium, characterized in that the computer-readable storage medium stores one or more programs which, when executed by one or more processors, implement the steps of the gait detection method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410547431.3A CN118452897A (en) | 2024-05-06 | 2024-05-06 | Gait detection, motion recognition and fatigue detection methods, devices and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410547431.3A CN118452897A (en) | 2024-05-06 | 2024-05-06 | Gait detection, motion recognition and fatigue detection methods, devices and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118452897A true CN118452897A (en) | 2024-08-09 |
Family
ID=92170089
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410547431.3A Pending CN118452897A (en) | 2024-05-06 | 2024-05-06 | Gait detection, motion recognition and fatigue detection methods, devices and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118452897A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN119679399A (en) * | 2025-02-21 | 2025-03-25 | 南昌大学第一附属医院 | Patient gait analysis method and system based on motion detection data |
CN119679399B (en) * | 2025-02-21 | 2025-05-20 | 南昌大学第一附属医院 | Patient gait analysis method and system based on motion detection data |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Gujarathi et al. | Gait analysis using imu sensor | |
Zhang et al. | Accurate ambulatory gait analysis in walking and running using machine learning models | |
Sabatini et al. | Assessment of walking features from foot inertial sensing | |
Li et al. | Walking speed and slope estimation using shank-mounted inertial measurement units | |
Aminian et al. | Spatio-temporal parameters of gait measured by an ambulatory system using miniature gyroscopes | |
US8109890B2 (en) | Body movement monitoring device | |
JP5586050B2 (en) | Gait analysis system | |
US8821417B2 (en) | Method of monitoring human body movement | |
Lee et al. | Portable activity monitoring system for temporal parameters of gait cycles | |
US20130123665A1 (en) | System and method for 3d gait assessment | |
US10524699B2 (en) | Method and system for monitoring terrain and gait and predicting upcoming terrain | |
CN108577854A (en) | Gait recognition method and gait ancillary equipment | |
KR20190014641A (en) | System and method for Gait analysis | |
CN110420029A (en) | A kind of walking step state wireless detecting system based on Multi-sensor Fusion | |
Bötzel et al. | Quantification of gait parameters with inertial sensors and inverse kinematics | |
JP2014208257A (en) | Gait analysis system | |
CN118975793A (en) | Gait analysis system and gait analysis method | |
CN118452897A (en) | Gait detection, motion recognition and fatigue detection methods, devices and storage medium | |
Borghetti et al. | Multisensor system for analyzing the thigh movement during walking | |
Bartoszek et al. | Comparison of the optoelectronic BTS Smart system and IMU-based MyoMotion system for the assessment of gait variables | |
Bishop et al. | Walking speed estimation using shank-mounted accelerometers | |
CN110693501A (en) | Wireless walking gait detection system based on multi-sensor fusion | |
Yang et al. | Ambulatory walking speed estimation under different step lengths and frequencies | |
Muhamad et al. | Design and implementation of wearable IMU sensor system for heel-strike and toe-off gait parameter measurement | |
CN112839569B (en) | Method and system for assessing human movement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||