CN113197571B - Gait training evaluation method and device based on radar
- Publication number: CN113197571B (application CN202110497172.4A)
- Authority: CN (China)
- Prior art keywords: training, gait, foot, radar, virtual object
- Legal status: Active (the legal status listed is an assumption and is not a legal conclusion)
Classifications
- A61B5/112 — Gait analysis
- A61B5/1121 — Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1126 — Measuring movement of the entire body or parts thereof using a particular sensing technique
- A63B22/025 — Treadmills driven by a motor with speed variation controlled electrically, e.g. D.C. motors with variable speed control
- A63B23/0458 — Step exercisers without moving parts
- A63B23/0464 — Walk exercisers without moving parts
- A63B71/0622 — Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
- A63B2071/0638 — Displaying moving images of recorded environment, e.g. virtual environment
Abstract
The invention discloses a radar-based gait training evaluation method comprising the following steps. S00: set up the operating environment of a training evaluation system. S10: project a planar virtual object through a projector. S20: acquire foot detection data of a gait trainer through a radar and record the acquisition time. S30: judge and calculate the positional relationship between the gait trainer's actual stepping foot and the planar virtual object according to the different training modes in the training evaluation system, and generate real-time associated data. S40: output an evaluation result for the gait trainer. With this radar-based gait training evaluation method, the position of the gait trainer's actual stepping foot can be obtained through the radar, and it can be determined whether the coordinates of the actual stepping foot are associated with the position of the virtually projected object. The method also supports a variety of interactive gait training schemes, which improves the interest and richness of the gait training and enables targeted, accurate gait rehabilitation training.
Description
Technical Field
The invention relates to the field of training evaluation methods, in particular to a radar-based gait training evaluation method and a radar-based gait training evaluation device.
Background
Clinically, conditions such as muscle spasm, muscle weakness, joint rigidity, brain injury and spinal cord injury can all cause walking dyskinesia in patients. Walking training is an important means of assisting the recovery of walking function in patients with the above conditions. At present, walking rehabilitation training in hospitals mainly relies on family members or therapists supporting the patient during walking training. This approach not only requires walking conditions to be fed back manually, but also restricts the patient's movement because of the physical support and reduces the richness of the gait. Hospitals have therefore gradually adopted gait training equipment to assist patients with walking training, so that walking function can be recovered more quickly.
Chinese patent document CN111603171A relates to a gait parameter determination method and system for lower limb rehabilitation. The method comprises: collecting raw data from a laser radar installed on a rehabilitation robot; determining, from the raw data, the times and positions at which the patient's feet leave and contact the ground; and calculating the patient's gait parameters, including step length, step width, swing time, stance time and support time, from those times and positions. That patent discloses only the acquisition of a patient's gait parameters by a lidar; it does not solve the problem of accurately detecting the patient with a radar in combination with a virtual projection.
Disclosure of Invention
To overcome the shortcomings of the prior art, the technical problem to be solved by the invention is to provide a radar-based gait training evaluation method that can obtain, through a radar, the association between a gait trainer's actual stepping foot and a virtual projection, while supporting a variety of interactive gait training schemes, thereby improving the interest and richness of the gait training and enabling targeted, accurate gait rehabilitation training.
To achieve the purpose, the invention adopts the following technical scheme:
the invention provides a radar-based gait training evaluation method, which is implemented according to the following steps:
S00: setting up an operation environment of a training evaluation system; in step S00, the training evaluation system is electrically connected to a radar gait rehabilitation training platform, the radar gait rehabilitation training platform is controlled by the training evaluation system, the operation environment setting-up step includes that after the training evaluation system is started, a gait trainer selects one of a long-distance linear walking training mode or a training pool stepping training mode through a man-machine interaction module of the training evaluation system, when the training evaluation system is configured into the long-distance linear walking training mode, the radar gait rehabilitation training platform is configured as a running table conveyor belt for movement, and when the training evaluation system is configured into the training pool stepping training mode, the radar gait rehabilitation training platform is configured as a running table conveyor belt or a flat ground.
S10: after a gait trainer selects a training mode, a projector projects the plane virtual object on the radar gait rehabilitation training platform, and preset coordinate data of the plane virtual object is stored in a storage module of the training evaluation system; in step S10, the projection range of the planar virtual object is configured to be within a preset range with the standing position of the gait trainer as a central coordinate, the long-distance straight walking training mode and the training pool stepping training mode both have different difficulty levels, and the training evaluation system adjusts the difficulty levels of the different training modes by adjusting the distance between the planar virtual object and the gait trainer, the moving speed of the planar virtual object and the throwing quantity of the planar virtual object.
S20: acquiring foot detection data of the moment that a gait trainer steps on the radar gait rehabilitation training platform through a radar and recording acquisition time; the two-dimensional point set corresponding to the foot detection data is denoted as { P ij, (i=0, 1,2, …, n) }, and the acquisition time is denoted as t j.
S30: the calculation module in the training evaluation system calculates the actual stepping foot outline and the actual stepping foot center point of the gait trainer according to the foot detection data and generates real-time calculation data, and under different training modes, the training evaluation system judges and calculates the position relation between the real-time calculation data and the virtual object outline and the virtual object standard center point of the plane virtual object and generates real-time association data.
S40: and an evaluation module in the training evaluation system inputs the real-time calculation data and the real-time association data into a gait evaluation algorithm to output an evaluation result of a gait trainer after calculation.
In step S30, the center point of the actual stepping foot is calculated as follows:
S301: acquire the edge point set of the actual stepping foot outline from the two-dimensional point set; T_ij is a subset of P_ij, and the edge point set of the actual stepping foot outline is denoted {T_ij, i = 0, 1, 2, … n};
S302: the calculation module fits and simplifies the edge point set of the actual stepping foot outline, and generates a simplified actual stepping foot polygon and a simplified actual stepping foot vertex set {Q_ij, i = 0, 1, 2, … n};
S303: calculate the footprint area from the simplified actual stepping foot vertex set; the simplified actual stepping foot vertex set contains n vertices, and the actual stepping foot area A is calculated as
A = (1/2)·|Σ_{i=0}^{n-1} (x_i·y_{i+1} − x_{i+1}·y_i)|  (with x_n = x_0 and y_n = y_0),
where x_i is the x-coordinate of the i-th point in Q_ij, y_{i+1} is the y-coordinate of the (i+1)-th point in Q_ij, x_{i+1} is the x-coordinate of the (i+1)-th point in Q_ij, and y_i is the y-coordinate of the i-th point in Q_ij;
S304: calculate the center point (C_x, C_y) of the actual stepping foot; the center point of the actual stepping foot is calculated as
C_x = (1/(6A))·Σ_{i=0}^{n-1} (x_i + x_{i+1})·(x_i·y_{i+1} − x_{i+1}·y_i),
C_y = (1/(6A))·Σ_{i=0}^{n-1} (y_i + y_{i+1})·(x_i·y_{i+1} − x_{i+1}·y_i),
where x_i is the x-coordinate of the i-th point in Q_ij, y_{i+1} is the y-coordinate of the (i+1)-th point in Q_ij, x_{i+1} is the x-coordinate of the (i+1)-th point in Q_ij, and y_i is the y-coordinate of the i-th point in Q_ij;
In step S301, in a preferred technical solution of the invention, the edge point set {T_ij, i = 0, 1, 2, … n} of the actual stepping foot outline is obtained from the two-dimensional point set by a discrimination formula, where j denotes the j-th acquisition (at time t_j), x_{T(i-1)j} and y_{T(i-1)j} denote the x- and y-coordinates of the (i−1)-th point in the subset T_ij of the j-th acquisition, x_{Tij} and y_{Tij} denote the x- and y-coordinates of the i-th point in the subset T_ij of the j-th acquisition, and x_{Pij} and y_{Pij} denote the x- and y-coordinates of the i-th point in the set P_ij of the j-th acquisition.
In step S30, the step distance is calculated as
F_d = |x_c1 − x_c2|,
where x_c1 is the coordinate of one of the gait trainer's feet in the X direction of the radar gait rehabilitation training platform in a given acquisition, and x_c2 is the coordinate of the other foot in the X direction of the radar gait rehabilitation training platform in that acquisition;
In step S30, the step length is calculated as
F_l = |y_c1 − (v_j·Δt + y_c2)|,
where y_c1 is the coordinate of one of the gait trainer's feet in the Y direction of the radar gait rehabilitation training platform in a given acquisition, y_c2 is the coordinate of the other foot in the Y direction of the radar gait rehabilitation training platform in that acquisition, v_j is the moving speed of the radar gait rehabilitation training platform, and the acquisition time is denoted t_j, where j takes the values 0, 1, 2, … m.
In the long-distance straight-line walking training mode, the planar virtual object is configured as a standard footprint, and the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the standard footprint is stepped on and the coincidence rate with the standard footprint;
the real-time associated data are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition; the virtual object standard center point is denoted C_S(x_cs, y_cs);
S306: compare the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_S(x_cs, y_cs) in a given acquisition with the threshold d_m; if the distance between (C_x, C_y) and C_S(x_cs, y_cs) is smaller than the threshold d_m, the step is marked as hitting the standard footprint; if the distance is not smaller than the threshold d_m, the step is regarded as a miss; the comparison is:
√((C_x − x_cs)² + (C_y − y_cs)²) < d_m
S307: calculate the coincidence rate P_S of the standard footprint for the step according to the coincidence-rate calculation formula.
In the training pool stepping training mode, the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as either whether the gait trainer's left or right foot can avoid the moving planar virtual object, or whether the planar virtual object can be stepped on, and the real-time associated data are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition; the virtual object standard center point is denoted C_O(x_co, y_co);
S306: compare the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) in a given acquisition with the threshold d_mo; if the distance between (C_x, C_y) and C_O(x_co, y_co) is greater than the threshold d_mo, the object is regarded as successfully avoided or not stepped on; if the distance is less than or equal to the threshold d_mo, the object is regarded as not successfully avoided or as stepped on; the comparison is:
√((C_x − x_co)² + (C_y − y_co)²) > d_mo
In a preferred technical solution of the invention, the radar gait rehabilitation training platform is configured as a stationary treadmill conveyor belt or a flat ground. When the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the gait trainer's left or right foot can avoid the moving planar virtual object, a training area is arranged on the stationary treadmill conveyor belt or flat ground, and the projector projects a guiding standing footprint onto the training area; and/or, when the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the planar virtual object can be stepped on, a training area and a peripheral area are arranged on the stationary treadmill conveyor belt or flat ground, the training area being located in the middle of the peripheral area, and the projector projects a guiding standing footprint in the training area.
In step S40, in the long-distance straight-line walking training mode, the straight-line walking training score in the gait evaluation algorithm is calculated according to the following steps:
S401: preset a standard step length value, a standard step distance value and a full-score value; the preset standard step length value is denoted F_lS, the preset standard step distance value is denoted F_wS, and the full-score value is set to 100 points;
S402: carry out one complete walking training session in the long-distance straight-line walking training mode; the storage module in the training evaluation system records the coincidence rate P_S of the standard footprint, the step length F_l and the step distance F_w for each step taken by the gait trainer during the straight-line walking training, and the calculation module in the training evaluation system calculates the average coincidence rate of the standard footprint, the average step length and the average step distance;
S403: the gait trainer's gait Score is calculated from the average coincidence rate of the standard footprint, the average step length and the average step distance.
In step S40, in the training pool stepping training mode, the stepping training score in the gait evaluation algorithm is calculated according to the following steps:
S404: preset weighting coefficients d_n for n different difficulty levels;
S405: carry out one complete stepping training session in the training pool stepping training mode, recording the difficulty levels used; the storage module in the training evaluation system records the total number of successful avoidances or successful steps C_n achieved by the gait trainer at each difficulty level during the complete stepping training; the stepping training score is then
Score = d_1·C_1 + d_2·C_2 + … + d_n·C_n
The invention also provides a radar-based gait training evaluation device for the above radar-based gait training evaluation method, which comprises:
a radar gait rehabilitation training platform, for rehabilitation training of the gait trainer;
a radar, for acquiring foot detection data of the gait trainer;
a projector, for projecting the planar virtual object onto the radar gait rehabilitation training platform;
and a training evaluation system, for guiding the gait trainer through rehabilitation training in the different rehabilitation modes; the training evaluation system comprises a storage module, a calculation module, an evaluation module and a man-machine interaction module, wherein the storage module is used for storing the preset coordinate data of the planar virtual object, the calculation module is used for generating the real-time calculation data and the real-time associated data, the evaluation module is used for calculating the evaluation result from the output of the calculation module, and the man-machine interaction module is used for inputting control instructions and displaying the evaluation result.
The beneficial effects of the invention are as follows:
In the radar-based gait training evaluation method provided by the invention, the radar acquires foot detection data of the gait trainer; data such as the actual stepping foot center point, the step distance and the step length are calculated from the foot detection data, and the positional relationship between these data and the virtual objects in the different training modes is then calculated to generate real-time associated data. In this way the association between the radar-acquired patient footprint data and the virtual projection data is obtained, so that quantities such as the hit rate in stepping games or the success probability in obstacle-avoidance games can be calculated accurately, providing an accurate data basis for the gait trainer's evaluation result. At the same time, the gait training evaluation method supports game interaction in a variety of training modes, which greatly improves the richness of the patient's gait.
Drawings
FIG. 1 is a flow chart of a radar-based gait training evaluation method provided in an embodiment of the invention;
FIG. 2 is a schematic illustration of a footprint provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a training area provided in a second embodiment of the present invention;
FIG. 4 is a schematic illustration of a training area provided in a third embodiment of the present invention;
fig. 5 is a functional block diagram of a radar-based gait training evaluation apparatus provided in an embodiment of the invention.
In the figure:
1. gait rehabilitation training platform; 2. radar; 3. projector; 4. training evaluation system; 41. storage module; 42. calculation module; 43. evaluation module; 44. man-machine interaction module.
Detailed Description
The technical scheme of the invention is further described below by the specific embodiments with reference to the accompanying drawings.
Example 1
As shown in fig. 1, the radar-based gait training evaluation method provided in this embodiment uses a radar 2 to collect gait rehabilitation data of a patient and evaluates those data. In this embodiment, the training evaluation system 4 is in the long-distance straight-line walking training mode and the gait rehabilitation training platform 1 is configured as a moving treadmill conveyor belt; in this mode the patient can play a walking game on the moving treadmill conveyor belt, and the planar virtual object is configured as a planar footprint. The method is implemented according to the following steps:
Step S00: the running environment of the training evaluation system 4 is built. In step S00, the training evaluation system 4 is electrically connected to the radar 2 gait rehabilitation training platform 1, the radar 2 gait rehabilitation training platform 1 is controlled by the training evaluation system 4, the operating environment setting up step includes that after the training evaluation system 4 is started, a gait trainer selects one of a long-distance straight walking training mode or a training pool stepping training mode through a man-machine interaction module 44 of the training evaluation system 4, when the training evaluation system 4 is configured into the long-distance straight walking training mode, the radar 2 gait rehabilitation training platform 1 is configured as a running table conveyor belt for movement, and when the training evaluation system 4 is configured into the training pool stepping training mode, the radar 2 gait rehabilitation training platform 1 is configured as a running table conveyor belt or a flat ground. The gait training evaluation method based on the radar is realized by a gait training evaluation device based on the radar, the gait training evaluation device comprises a radar 2 gait rehabilitation training platform 1, the radar 2, a projector 3 and a training evaluation system 4, the radar 2 is arranged right in front of the rehabilitation training platform, the detection area of the radar 2 is the upper surface of the rehabilitation training platform and is very close to the upper surface of the rehabilitation training platform, the projector 3 is positioned on one side of the rehabilitation training platform, and the projector 3 projects the projection of the radar 2 to the upper surface of the rehabilitation training platform. The driving module of the radar 2 gait rehabilitation training platform 1, the radar 2 and the projector 3 can perform data interaction with the training evaluation system 4, and are controlled by a computer on which the training evaluation system 4 is carried.
Step S10: projecting a plane virtual object on the radar 2 gait rehabilitation training platform 1 according to a step trainer selective training mode through the projector 3, and storing preset coordinate data of the plane virtual object into a storage module 41 of the training evaluation system 4; in step S10, the projection range of the planar virtual object is configured to have different difficulty levels in the preset range with the standing position of the gait trainer as the center coordinate, and the training evaluation system 4 adjusts the difficulty levels of different training modes by adjusting the distance between the planar virtual object and the gait trainer, the moving speed of the planar virtual object and the number of the planar virtual objects, and the difficulty levels of different training modes can also be combined with the diagnosis and treatment scheme to achieve the ideal treatment effect, for example: the diagnosis and treatment quality is gradually improved by arranging a first-easy last-difficult material arranging scheme. The computer is internally provided with a storage module 41, and the relative position of the projector 3 and the gait training evaluation device based on the radar is determined after the completion of the construction, so that when the plane virtual object is projected on the radar 2 gait training platform 1, the preset coordinate data of the plane virtual object can be preset, and can also be input through the man-machine interaction module 44 of the training evaluation system 4.
Step S20: acquiring foot detection data of a gait trainer through a radar 2 and recording acquisition time; the two-dimensional point set corresponding to the foot detection data is denoted as { P ij, (i=0, 1,2, …, n) }, and the acquisition time is denoted as t j, and since the scanning period of the radar 2 is usually fixed, j in t j may also represent the acquisition times, and knowing the acquisition times is equivalent to knowing the acquisition time t j in the case of fixed acquisition periods. When a patient performs walking training on the radar 2 gait rehabilitation training platform 1, after the scanning period of the radar 2 is reasonably set, the radar 2 can continuously scan whether the foot of a gait trainer exists on the upper surface of the rehabilitation training platform, and when the patient detects the foot of the gait trainer, the foot detection data of the gait trainer are acquired, and the acquisition time or the scanning frequency is recorded.
Step S30: the calculation module 42 in the training evaluation system 4 calculates the actual foot stepping outline and the actual foot stepping center point of the gait trainer according to the foot detection data and generates real-time calculation data, and judges and calculates the position relationship between the real-time calculation data and the virtual object outline and the virtual object standard center point of the planar virtual object according to different training modes in the training evaluation system 4 and generates real-time correlation data. Different real-time associated data are required in different training modes, because the estimated standards are inconsistent, the long-distance straight walking training mode needs to verify whether the actual stepping foot of the patient steps on the middle plane virtual object, so that the three data of the center point, the step distance and the step length of the actual stepping foot need to be calculated, and in the training pool stepping training mode, the calculation data of the virtual object outline of the plane virtual object are mainly needed to judge whether to step on the data such as the stepping touch rate.
To improve the calculation accuracy of the actual stepping foot center point, in step S30 the actual stepping foot center point is calculated as follows:
Step S301: acquire the edge point set of the actual stepping foot outline from the two-dimensional point set, where T_ij is a subset of P_ij and the edge point set of the actual stepping foot outline is denoted {T_ij, i = 0, 1, 2, … n}. As shown in fig. 2, the edge point set of the actual stepping foot outline traces the footprint of the patient's actual stepping foot, the outline of the actual stepping foot can be determined from the footprint, and the subset T_ij is obtained from the discrimination formula given below.
Step S302: fit and simplify the edge point set of the actual stepping foot outline to generate a simplified actual stepping foot polygon and a simplified actual stepping foot vertex set {Q_ij, i = 0, 1, 2, … n}. Specifically, the edge of the footprint is fitted and simplified into a polygon using the Douglas-Peucker algorithm, and the simplified actual stepping foot polygon is generated from the fitting and simplification results so that the actual stepping foot area and center point can be calculated.
Step S303: calculating footprint area from the reduced actual foot-stepping polygon and the reduced actual foot-stepping vertex set { Q ij, (i=0, 1,2, … n) }; the simplified actual stepping foot vertex set { Q ij, (i=0, 1,2, … n) } contains n vertices, and the calculation formula of the actual stepping foot area a is as follows:
Where x i is the x-coordinate of the i-th point in Q ij, y i+1 is the y-coordinate of the i+1th point in Q ij, x i+1 is the x-coordinate of the i+1th point in Q ij, and y i is the y-coordinate of the i-th point in Q ij.
Step S304: calculating the center point (C x,Cy) of the actual stepping foot; the calculation formula of the center point of the actual stepping foot is as follows:
Where x i is the x-coordinate of the i-th point in Q ij, y i+1 is the y-coordinate of the i+1th point in Q ij, x i+1 is the x-coordinate of the i+1th point in Q ij, and y i is the y-coordinate of the i-th point in Q ij.
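As a worked example of steps S303 and S304, the sketch below computes the polygon area and center point with the shoelace and polygon-centroid formulas, which match the variable definitions above; these standard formulas are assumed here as the reconstruction of the patent's equations rather than copied from its drawings.

```python
def foot_area_and_center(vertices):
    """Shoelace area and centroid of the simplified stepping-foot polygon Q_ij.

    vertices: list of (x, y) tuples, ordered around the polygon.
    Returns (A, (C_x, C_y)).
    """
    n = len(vertices)
    if n < 3:
        raise ValueError("a polygon needs at least three vertices")
    signed_area = 0.0
    cx = cy = 0.0
    for i in range(n):
        x_i, y_i = vertices[i]
        x_next, y_next = vertices[(i + 1) % n]   # wrap around: vertex n is vertex 0
        cross = x_i * y_next - x_next * y_i
        signed_area += cross
        cx += (x_i + x_next) * cross
        cy += (y_i + y_next) * cross
    signed_area *= 0.5
    area = abs(signed_area)
    # The centroid uses the signed area, so the sign cancels regardless of winding order.
    cx /= (6.0 * signed_area)
    cy /= (6.0 * signed_area)
    return area, (cx, cy)

# Example: a 4 cm x 10 cm rectangular "footprint" has area 40 and center (2, 5).
print(foot_area_and_center([(0, 0), (4, 0), (4, 10), (0, 10)]))
```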
To acquire an accurate edge point set {T_ij, i = 0, 1, 2, … n} of the actual stepping foot outline from the two-dimensional point set, it is further preferred that, in step S301, the edge point set of the actual stepping foot outline is obtained from the two-dimensional point set {P_ij, i = 0, 1, 2, …, n} by a discrimination formula,
where j denotes the j-th acquisition (at time t_j), x_{T(i-1)j} and y_{T(i-1)j} denote the x- and y-coordinates of the (i−1)-th point in the subset T_ij of the j-th acquisition, x_{Tij} and y_{Tij} denote the x- and y-coordinates of the i-th point in the subset T_ij of the j-th acquisition, and x_{Pij} and y_{Pij} denote the x- and y-coordinates of the i-th point in the set P_ij of the j-th acquisition. That is, the subset of the two-dimensional point set P_ij whose points satisfy the discrimination formula is the edge point set {T_ij, i = 0, 1, 2, … n} of the actual stepping foot outline.
Regarding the step distance: in the radar-based gait training evaluation method provided in this embodiment, the step distance refers to the distance from one foot to the other in the X direction of the gait rehabilitation training platform 1 in a given acquisition, so in step S30 the step distance is calculated as
F_d = |x_c1 − x_c2|,
where x_c1 is the coordinate of one of the gait trainer's feet in the X direction of the gait rehabilitation training platform 1 in a given acquisition, i.e. the abscissa of that foot, and x_c2 is the coordinate of the other foot in the X direction of the gait rehabilitation training platform 1 in the same acquisition, i.e. the abscissa of the other foot.
Regarding the step length: in the radar-based gait training evaluation method provided in this embodiment, the step length refers to the distance from one foot to the other in the Y direction of the gait rehabilitation training platform 1 in a given acquisition. If the gait rehabilitation training platform 1 is moving, its moving speed v_j must be taken into account; if the platform were also moving in the X direction, its speed in the X direction would have to be considered as well, but no X-direction motion occurs in the actual long-distance straight-line walking training mode. In step S30, the step length is calculated as
F_l = |y_c1 − (v_j·Δt + y_c2)|,
where y_c1 is the coordinate of one of the gait trainer's feet in the Y direction of the gait rehabilitation training platform 1 in a given acquisition, y_c2 is the coordinate of the other foot in the Y direction in the same acquisition, v_j is the moving speed of the gait rehabilitation training platform 1, and the acquisition time is denoted t_j, where j takes the values 0, 1, 2, … n.
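A minimal sketch of the step distance and step length calculations above; the two foot center points per acquisition and the belt speed are assumed to be available from the earlier steps, and `dt` stands for the Δt between the two foot contacts, which the patent does not define further.

```python
def step_distance(foot_a, foot_b):
    """F_d = |x_c1 - x_c2|: lateral separation of the two foot centers in one acquisition."""
    return abs(foot_a[0] - foot_b[0])

def step_length(foot_a, foot_b, belt_speed, dt):
    """F_l = |y_c1 - (v_j * dt + y_c2)|: longitudinal step length, compensating for belt travel."""
    return abs(foot_a[1] - (belt_speed * dt + foot_b[1]))

# Example with assumed units of metres and seconds:
left_center, right_center = (0.10, 0.85), (-0.08, 0.30)
print(step_distance(left_center, right_center))          # 0.18
print(step_length(left_center, right_center, 0.6, 0.5))  # |0.85 - (0.3 + 0.30)| = 0.25
```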
To determine whether the patient has stepped on the planar footprint and to obtain the coincidence rate of the step, the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the standard footprint is stepped on and the coincidence rate with the standard footprint, and the real-time associated data are calculated as follows:
Step S305: preset the virtual object standard center point for a given acquisition. The virtual object standard center point is denoted C_S(x_cs, y_cs).
Step S306: compare the distance between the actual stepping foot center point (C_x, C_y) of the patient's actual stepping foot and the standard center point C_S(x_cs, y_cs) in a given acquisition with the threshold d_m. If the distance between (C_x, C_y) and C_S(x_cs, y_cs) is smaller than the threshold d_m, the step is regarded as a hit and is recorded as hitting the standard footprint; the training evaluation system 4 then adjusts the color of the screen, or of a marker on the screen, through the man-machine interaction module 44, so that the gait trainer learns the result of each step in time and gains confidence in the training. If the distance between (C_x, C_y) and C_S(x_cs, y_cs) is not smaller than the threshold d_m, the step is regarded as a miss.
The comparison is:
√((C_x − x_cs)² + (C_y − y_cs)²) < d_m
Step S307: calculate the coincidence rate P_S of the standard footprint for the step in the given acquisition; the coincidence rate P_S is an important index for evaluating the quality of the gait trainer's stepping.
Step S40: the evaluation module 43 in the training evaluation system 4 inputs the real-time calculation data and the real-time association data into the gait evaluation algorithm, and outputs the evaluation result of the gait trainer after the statistical calculation. Different training modes refer to different training modes, after the gait evaluation algorithm counts and analyzes the real-time calculation data and the real-time association data, the score of a certain game can be output for the patient, and some clearance games can be selectively set for the patient to play, so that the richness of the gait of the patient can be greatly improved after the patient finishes different types of games. In the long-distance straight-line walking training mode, the evaluation result is based on comprehensive evaluation of the actual stepping foot outline, the actual stepping foot center point, the coincidence rate of the standard foot mark in stepping and the straight-line walking training score each time by a gait trainer. In the training pool stepping training mode, the number of times that a gait trainer avoids or steps on a plane virtual object of movement each time in the training pool stepping training scheme is output, comprehensive evaluation is performed on the performance of the plane virtual object under different difficulty levels, and finally the evaluation result of the gait trainer is output after the comprehensive evaluation.
To evaluate the patient's game result accurately, in step S40 the gait evaluation algorithm is implemented as follows:
In step S40, in the long-distance straight-line walking training mode, the straight-line walking training score in the gait evaluation algorithm is calculated as follows:
S401: preset a standard step length value, a standard step distance value and a full-score value; the preset standard step length value is denoted F_lS, the preset standard step distance value is denoted F_wS, and the full-score value is set to 100 points. The standard step values can be determined from the person's height, weight, shoulder width or waistline, and the full score is set by the user.
S402: the storage module 41 in the training evaluation system 4 records the coincidence rate P_S of the standard footprint, the step length F_l and the step distance F_w for each step taken by the gait trainer in the straight-line walking training, and the calculation module 42 in the training evaluation system 4 calculates the average coincidence rate of the standard footprint, the average step length and the average step distance.
S403: the gait trainer's gait Score is calculated from the average coincidence rate of the standard footprint, the average step length and the average step distance.
Example 2
The radar-based gait training evaluation method provided in this embodiment uses the radar 2 to collect gait rehabilitation data of a patient and evaluates those data. In this embodiment, the training evaluation system 4 is in an obstacle-avoidance training mode, the gait rehabilitation training platform 1 is configured as a flat ground or a stationary treadmill conveyor belt, the patient can play an obstacle-avoidance game on the flat ground or the stationary treadmill conveyor belt, and the planar virtual object is configured as a planar obstacle, for example a bomb or a flying ball. As shown in fig. 3, the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the gait trainer's left or right foot can avoid the moving planar virtual object. Preferably, a training area is provided on the stationary treadmill conveyor belt or flat ground, and the projector 3 projects a guiding standing footprint onto the training area; the guiding standing footprint comprises a left-right standing virtual footprint, an upper-left/lower-right standing virtual footprint and an upper-right/lower-left standing virtual footprint, through which the patient can be guided into a standing position in preparation for obstacle avoidance, and the different standing positions can specifically help the gait trainer recover different parts of the body.
To calculate accurately whether the obstacle is avoided successfully, the real-time associated data for the positional relationship between the gait trainer's actual stepping foot and the planar virtual object are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition. The virtual object standard center point is denoted C_O(x_co, y_co).
S306: compare the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) in a given acquisition with the threshold d_mo; if the distance between (C_x, C_y) and C_O(x_co, y_co) is greater than the threshold d_mo, the avoidance is judged successful; if the distance is less than or equal to the threshold d_mo, the avoidance is judged unsuccessful. The comparison is:
√((C_x − x_co)² + (C_y − y_co)²) > d_mo
In step S40, in the training pool stepping training mode, the stepping training score in the gait evaluation algorithm is calculated as follows:
S404: preset the weighting coefficients d_n for n different difficulty levels. For example, weighting coefficients are preset for 3 difficulty levels — easy, medium and difficult — as listed in the following table.
Sequence number | Difficulty level | Speed value | Weighting coefficient (d_n)
1 | Easy | Configurable by the rehabilitation therapist | 1
2 | Medium | Configurable by the rehabilitation therapist | 1.5
3 | Difficult | Configurable by the rehabilitation therapist | 2
S405: and (3) implementing a complete stepping training under the training pool stepping training mode, recording different difficulty levels, and recording the total evasion success number or total stepping number C n corresponding to the different difficulty levels in the complete stepping training by a gait trainer in the stepping training by a storage module 41 in the training evaluation system 4.
Score=d1C1+d2C2+…+dnCn
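A brief sketch of the weighted-sum scoring above; the per-level counts and the weighting coefficients follow the example table, and the variable names are illustrative only.

```python
def stepping_score(counts_by_level, weights_by_level):
    """Score = d_1*C_1 + d_2*C_2 + ... + d_n*C_n over the difficulty levels used."""
    return sum(weights_by_level[level] * count
               for level, count in counts_by_level.items())

# Counts of successful avoidances per difficulty level in one session (placeholders),
# weighted with the coefficients from the table above.
counts = {"easy": 12, "medium": 8, "difficult": 3}
weights = {"easy": 1.0, "medium": 1.5, "difficult": 2.0}
print(stepping_score(counts, weights))  # 12*1.0 + 8*1.5 + 3*2.0 = 30.0
```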
Example 3
The radar-based gait training evaluation method provided in this embodiment uses the radar 2 to collect gait rehabilitation data of a patient and evaluates those data. In this embodiment, the training evaluation system 4 is in the training pool stepping training mode, the gait rehabilitation training platform 1 is configured as a flat ground or a stationary treadmill conveyor belt, the patient can play a stepping game on the flat ground or stationary treadmill conveyor belt, and the planar virtual object is configured as a planar moving object, for example a balloon or a mouse. As shown in fig. 4, the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the gait trainer can step on the planar virtual object. Preferably, a training area and a peripheral area are arranged on the stationary treadmill conveyor belt or flat ground, with the training area located in the middle of the peripheral area; the projector 3 projects guiding standing footprints in the training area, comprising a left-right standing virtual footprint, an upper-left/lower-right standing virtual footprint and an upper-right/lower-left standing virtual footprint, and the patient can be guided into a standing position through these footprints in preparation for the stepping game, so that different body parts are rehabilitated.
To calculate accurately whether a step is effective, the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is configured as whether the planar virtual object can be stepped on, and the real-time associated data for this positional relationship are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition; the virtual object standard center point is denoted C_O(x_co, y_co);
S306: compare the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) in a given acquisition with the threshold d_mo; if the distance between (C_x, C_y) and C_O(x_co, y_co) is greater than the threshold d_mo, the object is regarded as not stepped on; if the distance is less than or equal to the threshold d_mo, the object is regarded as stepped on. The comparison is:
√((C_x − x_co)² + (C_y − y_co)²) ≤ d_mo
In step S40, in the training pool stepping training mode, the stepping training score in the gait evaluation algorithm is calculated as follows:
S404: preset the weighting coefficients d_n for n different difficulty levels. In the stepping training of the training pool stepping training mode, two factors are considered in the weighting: the position code and the retention time. The position code is determined from the codes of the training area and the peripheral area, and the retention time is the time difference, in seconds, between the planar virtual object appearing and disappearing. By setting the position codes of the different areas and the retention time of the planar virtual object, the rehabilitating person's reaction capability and rehabilitation progress can be evaluated well.
Sequence number | Difficulty level | Position code | Retention time | Weighting coefficient (d_n)
1 | Easy | A or B or C or D | 20 seconds | 1
2 | Medium | E or F or G or H | 20 seconds | 1.5
3 | Difficult | E or F or G or H | 10 seconds | 2
4 | Medium | A or B or C or D | 15 seconds | 1.5
5 | Difficult | A or B or C or D | 5 seconds | 2
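For illustration, the difficulty table can be held in a small configuration structure like the one below; the level names, area codes and values mirror the table, while the field names themselves are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DifficultySetting:
    level: str             # "easy", "medium" or "difficult"
    position_codes: tuple  # codes of the training / peripheral sub-areas used
    retention_s: int       # seconds the virtual object stays visible
    weight: float          # weighting coefficient d_n

DIFFICULTY_TABLE = [
    DifficultySetting("easy",      ("A", "B", "C", "D"), 20, 1.0),
    DifficultySetting("medium",    ("E", "F", "G", "H"), 20, 1.5),
    DifficultySetting("difficult", ("E", "F", "G", "H"), 10, 2.0),
    DifficultySetting("medium",    ("A", "B", "C", "D"), 15, 1.5),
    DifficultySetting("difficult", ("A", "B", "C", "D"),  5, 2.0),
]
```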
S405: and (3) implementing a complete stepping training under the training pool stepping training mode, recording different difficulty levels, and recording the total evasion success number or total stepping number C n corresponding to the different difficulty levels in the complete stepping training by the gait trainer in the stepping training by the storage module 41 in the training evaluation system 4.
Score=d1C1+d2C2+…+dnCn
Example 4
As shown in fig. 5, the radar-based gait training evaluation device for the radar-based gait training evaluation method provided in this fourth embodiment comprises a gait rehabilitation training platform 1, a radar 2, a projector 3 and a training evaluation system 4. The gait rehabilitation training platform 1 is used by the gait trainer for rehabilitation training, the radar 2 is used to collect foot detection data of the gait trainer, the projector 3 is used to project the planar virtual object onto the gait rehabilitation training platform 1, and the training evaluation system 4 is used to guide the gait trainer through rehabilitation training in the different rehabilitation modes. The training evaluation system 4 comprises a storage module 41, a calculation module 42, an evaluation module 43 and a man-machine interaction module 44; the storage module 41 stores the preset coordinate data of the planar virtual object, the calculation module 42 generates the real-time calculation data and the real-time associated data, the evaluation module 43 calculates the evaluation result from the output of the calculation module 42, and the man-machine interaction module 44 is used to input control instructions and display the evaluation result. The radar 2 is arranged directly in front of the gait rehabilitation training platform, and its detection area is the upper surface of the rehabilitation training platform, lying close to that surface, which protects the accuracy of the collected data; the projector 3 is located on one side of the gait rehabilitation training platform and projects its image onto the upper surface of the rehabilitation training platform.
In use, after the operating environment of the training evaluation system 4 based on the gait rehabilitation training platform 1 has been set up, a training mode is selected through the man-machine interaction module 44, and a planar virtual object is then projected onto the gait rehabilitation training platform 1 through the projector 3 according to the preset training mode, with the preset coordinate data of the planar virtual object stored in the storage module 41 of the training evaluation system 4. Foot detection data of the gait trainer are acquired through the radar 2 and the acquisition time is recorded; from the foot detection data, the calculation module 42 in the training evaluation system 4 selectively calculates one or more of the gait trainer's actual stepping foot center point, step distance and step length and generates real-time calculation data; the positional relationship between the gait trainer's actual stepping foot and the planar virtual object is judged and calculated according to the training mode selected in the training evaluation system 4, and real-time associated data are generated; the evaluation module 43 in the training evaluation system 4 inputs the real-time calculation data and the real-time associated data into the gait evaluation algorithm for statistical calculation, and the evaluation result of the gait trainer is then output and displayed through the man-machine interaction module 44, so that the gait trainer can quickly and intuitively obtain a comprehensive evaluation of the training effect. The training evaluation system 4 also allows the user to call up the real-time step distance and step length data and turn them into a chart for further analysis of the patient's stepping stability.
While the application has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the application. The application is not to be limited by the specific embodiments disclosed herein, but rather, embodiments falling within the scope of the appended claims are intended to be embraced by the application.
Claims (9)
1. A radar-based gait training evaluation method, which is characterized by comprising the following steps:
S00: setting up the operating environment of a training evaluation system; in step S00, the training evaluation system is electrically connected with a radar gait rehabilitation training platform, and the radar gait rehabilitation training platform is controlled by the training evaluation system; the step of setting up the operating environment includes: after the training evaluation system is started, a gait trainer selects one of a long-distance straight walking training mode and a training pool stepping training mode through a man-machine interaction module of the training evaluation system; when the training evaluation system is configured in the long-distance straight walking training mode, the radar gait rehabilitation training platform is configured as a moving treadmill conveyor belt, and when the training evaluation system is configured in the training pool stepping training mode, the radar gait rehabilitation training platform is configured as a treadmill conveyor belt or a flat ground;
S10: after the gait trainer selects a training mode, a projector projects a planar virtual object on the radar gait rehabilitation training platform, and preset coordinate data of the planar virtual object is stored in a storage module of the training evaluation system; in step S10, the projection range of the planar virtual object lies within a preset range centered on the standing position of the gait trainer and is configured with different difficulty levels in the long-distance straight walking training mode and the training pool stepping training mode, and the training evaluation system adjusts the difficulty level of each training mode by adjusting the distance between the planar virtual object and the gait trainer, the moving speed of the planar virtual object and the number of planar virtual objects projected;
S20: acquiring, through a radar, foot detection data at the moment the gait trainer steps on the radar gait rehabilitation training platform, and recording the acquisition time; the two-dimensional point set corresponding to the foot detection data is recorded as {P_ij, (i = 0, 1, 2, …, n)}, and the acquisition time is recorded as t_j;
S30: a calculation module in the training evaluation system calculates the actual stepping foot outline and the actual stepping foot center point of the gait trainer according to the foot detection data and generates real-time calculation data; under the different training modes, the training evaluation system judges and calculates the positional relationship between the real-time calculation data and the virtual object outline and virtual object standard center point of the planar virtual object, and generates real-time associated data;
S40: an evaluation module in the training evaluation system inputs the real-time calculation data and the real-time association data into a gait evaluation algorithm to output an evaluation result of a gait trainer after calculation;
in step S30, the calculation step of the center point of the actual stepping foot is as follows:
S301: acquiring an edge point set of the actual stepping foot outline according to the two-dimensional point set; wherein T_ij is a subset of P_ij, and the edge point set of the actual stepping foot outline is denoted {T_ij, (i = 0, 1, 2, …, n)};
S302: the calculation module fits and simplifies the edge point set of the actual stepping foot outline, and generates a simplified actual stepping foot polygon and a simplified actual stepping foot vertex set {Q_ij, (i = 0, 1, 2, …, n)};
S303: calculating the footprint area according to the simplified actual stepping foot vertex set; wherein the simplified actual stepping foot vertex set contains n vertices, and the calculation formula of the actual stepping foot area A is as follows:
A = (1/2)·|Σ_{i=0}^{n−1} (x_i·y_{i+1} − x_{i+1}·y_i)|, the indices being taken cyclically so that vertex n coincides with vertex 0,
where x_i is the x-coordinate of the i-th point in Q_ij, y_{i+1} is the y-coordinate of the (i+1)-th point in Q_ij, x_{i+1} is the x-coordinate of the (i+1)-th point in Q_ij, and y_i is the y-coordinate of the i-th point in Q_ij;
S304: calculating the center point (C_x, C_y) of the actual stepping foot; the calculation formula of the center point of the actual stepping foot is as follows:
C_x = (1/(6A))·Σ_{i=0}^{n−1} (x_i + x_{i+1})·(x_i·y_{i+1} − x_{i+1}·y_i)
C_y = (1/(6A))·Σ_{i=0}^{n−1} (y_i + y_{i+1})·(x_i·y_{i+1} − x_{i+1}·y_i)
where x_i is the x-coordinate of the i-th point in Q_ij, y_{i+1} is the y-coordinate of the (i+1)-th point in Q_ij, x_{i+1} is the x-coordinate of the (i+1)-th point in Q_ij, y_i is the y-coordinate of the i-th point in Q_ij, and A is the actual stepping foot area obtained in step S303.
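As a concrete check of steps S303–S304, the short Python sketch below implements the shoelace area and polygon-centroid formulas reconstructed above for the simplified vertex set Q_ij. The function name and the rectangular test footprint are illustrative only, and the vertices are assumed to be ordered around the outline (the signed area handles either orientation).

```python
from typing import List, Tuple

def foot_area_and_center(vertices: List[Tuple[float, float]]) -> Tuple[float, float, float]:
    """Shoelace area A and centroid (C_x, C_y) of the simplified stepping-foot polygon."""
    n = len(vertices)
    a2 = 0.0            # twice the signed area
    cx = cy = 0.0
    for i in range(n):
        x_i, y_i = vertices[i]
        x_n, y_n = vertices[(i + 1) % n]        # wrap around to close the polygon
        cross = x_i * y_n - x_n * y_i
        a2 += cross
        cx += (x_i + x_n) * cross
        cy += (y_i + y_n) * cross
    area = abs(a2) / 2.0
    cx /= 3.0 * a2      # equals (1/(6A)) * sum when A is the signed area
    cy /= 3.0 * a2
    return area, cx, cy

# Example: a 10 cm x 25 cm rectangular footprint centered at (5, 12.5)
print(foot_area_and_center([(0, 0), (10, 0), (10, 25), (0, 25)]))  # (250.0, 5.0, 12.5)
```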
2. The radar-based gait training evaluation method according to claim 1, wherein:
In step S301, the discrimination formula for obtaining the edge point set {T_ij, (i = 0, 1, 2, …, n)} of the actual stepping foot outline from the two-dimensional point set is as follows:
where j denotes the j-th acquisition, i.e. the acquisition at time t_j; the quantities entering the formula are the x- and y-coordinates of the (i−1)-th point in the subset T_ij of the j-th acquisition, the x- and y-coordinates of the i-th point in the subset T_ij of the j-th acquisition, and the x- and y-coordinates of the i-th point in the point set P_ij of the j-th acquisition.
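The discrimination formula referred to in claim 2 appears only as an image in the source and is not reproduced above. Purely as a hedged stand-in, the sketch below extracts an edge point subset T_ij from a two-dimensional point set P_ij using a convex hull (Andrew's monotone chain); the patented criterion may differ, so treat this only as one plausible way to obtain outline points from radar detections.

```python
def edge_points(points):
    """Convex hull (monotone chain) as a stand-in edge-point extractor for P_ij."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def half_hull(seq):
        hull = []
        for p in seq:
            # Pop while the last turn is clockwise or collinear.
            while len(hull) >= 2 and (
                (hull[-1][0] - hull[-2][0]) * (p[1] - hull[-2][1])
                - (hull[-1][1] - hull[-2][1]) * (p[0] - hull[-2][0])
            ) <= 0:
                hull.pop()
            hull.append(p)
        return hull[:-1]

    return half_hull(pts) + half_hull(list(reversed(pts)))

print(edge_points([(0, 0), (2, 0), (1, 1), (2, 2), (0, 2), (1, 0.5)]))
# [(0, 0), (2, 0), (2, 2), (0, 2)] -- interior points are dropped
```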
3. The radar-based gait training evaluation method according to claim 1, wherein:
in step S30, the step distance is calculated as follows:
F_d = |x_c1 − x_c2|
wherein x_c1 represents the coordinate of one foot of the gait trainer in the X direction of the radar gait rehabilitation training platform in a given acquisition, and x_c2 represents the coordinate of the other foot of the gait trainer in the X direction of the radar gait rehabilitation training platform in the same acquisition;
In step S30, the calculation formula of the step length is as follows:
F_l = |y_c1 − (v_j·Δt + y_c2)|
wherein y_c1 represents the coordinate of one foot of the gait trainer in the Y direction of the radar gait rehabilitation training platform in a given acquisition, y_c2 represents the coordinate of the other foot of the gait trainer in the Y direction of the radar gait rehabilitation training platform in a given acquisition, v_j is the moving speed of the radar gait rehabilitation training platform, Δt is the time interval between the two acquisitions concerned, the acquisition time is recorded as t_j, and j takes the values 0, 1, 2, …, m.
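Claim 3's two formulas translate directly into code. In the sketch below the function and parameter names are illustrative; belt_speed and dt stand for the belt speed v_j and the time interval Δt between the two acquisitions being compared.

```python
def step_distance(xc1: float, xc2: float) -> float:
    """F_d = |x_c1 - x_c2|: lateral distance between the two detected feet."""
    return abs(xc1 - xc2)

def step_length(yc1: float, yc2: float, belt_speed: float, dt: float) -> float:
    """F_l = |y_c1 - (v_j * dt + y_c2)|: step length corrected for belt motion."""
    return abs(yc1 - (belt_speed * dt + yc2))

# Example: feet 0.18 m apart laterally; belt moving at 0.8 m/s over 0.6 s
print(step_distance(0.09, -0.09))          # 0.18
print(step_length(1.10, 0.20, 0.8, 0.6))   # |1.10 - (0.48 + 0.20)| = 0.42
```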
4. The radar-based gait training evaluation method according to claim 1, wherein:
in the long-distance straight walking training mode, the planar virtual object is configured as a standard footprint, and the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the standard footprint is stepped on and as the coincidence rate with the standard footprint when it is stepped on;
the calculation steps of the real-time associated data are as follows:
S305: presetting the virtual object standard center point in a given acquisition; wherein the virtual object standard center point is denoted C_S(x_cs, y_cs);
S306: comparing the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_S(x_cs, y_cs) in a given acquisition with the threshold d_m; if the distance between (C_x, C_y) and C_S(x_cs, y_cs) is smaller than the threshold d_m, the step is recorded as stepping on the standard footprint; if the distance is not smaller than the threshold d_m, the step is not counted as stepping on the standard footprint; wherein the comparison formula is as follows:
√((C_x − x_cs)² + (C_y − y_cs)²) < d_m
S307: calculating the coincidence rate P_S with the standard footprint when it is stepped on; wherein the calculation formula of the coincidence rate P_S is as follows:
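A small sketch of the hit test in step S306 follows. The threshold comparison uses the Euclidean distance between the foot center and C_S as reconstructed above; the coincidence rate P_S of step S307 is not given explicitly in the source, so the overlap-area ratio used below (intersection area divided by the standard footprint area, computed with the third-party shapely library) is only an assumed definition.

```python
from math import hypot
from shapely.geometry import Polygon  # used only for the assumed overlap-based P_S

def steps_on_standard_footprint(center, target_center, d_m):
    """Threshold test of S306: distance from foot center (C_x, C_y) to C_S vs d_m."""
    cx, cy = center
    xs, ys = target_center
    return hypot(cx - xs, cy - ys) < d_m

def coincidence_rate(foot_polygon, standard_polygon):
    """Assumed definition of P_S: overlap area divided by standard footprint area."""
    foot = Polygon(foot_polygon)
    std = Polygon(standard_polygon)
    return foot.intersection(std).area / std.area

# Example: foot center 3 cm from the target center, threshold d_m = 5 cm
print(steps_on_standard_footprint((0.03, 0.00), (0.0, 0.0), 0.05))  # True
```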
5. The radar-based gait training evaluation method according to claim 3, wherein:
in the training pool stepping training mode, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the left foot or the right foot of the gait trainer can avoid the moving planar virtual object, or as whether the planar virtual object can be stepped on, and the calculation steps of the real-time associated data are as follows:
S305: presetting the virtual object standard center point in a given acquisition; wherein the virtual object standard center point is denoted C_O(x_co, y_co);
S306: comparing the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) in a given acquisition with the threshold d_mo; if the distance between (C_x, C_y) and C_O(x_co, y_co) is greater than the threshold d_mo, the planar virtual object is considered successfully avoided or not stepped on; if the distance is less than or equal to the threshold d_mo, the planar virtual object is not successfully avoided or is considered stepped on; wherein the comparison formula is as follows:
√((C_x − x_co)² + (C_y − y_co)²) > d_mo
6. The radar-based gait training evaluation method according to claim 5, wherein:
the radar gait rehabilitation training platform is configured as a stationary treadmill conveyor belt or a flat ground;
when the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the left foot or the right foot of the gait trainer can avoid the moving planar virtual object, a training area is arranged on the stationary treadmill conveyor belt or the flat ground, and the projector projects a guide standing footprint on the training area; and/or, when the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the planar virtual object can be stepped on, a training area and a peripheral area are arranged on the stationary treadmill conveyor belt or the flat ground, the training area is located in the middle of the peripheral area, and the projector projects a guide standing footprint in the training area.
7. The radar-based gait training evaluation method according to claim 4, wherein:
in step S40, in the long-distance straight walking training mode, the straight walking training score calculation process in the gait evaluation algorithm is implemented as follows:
S401: presetting a standard step length value, a standard step distance value and a full score value; wherein the preset standard step length value is denoted F_1S, the preset standard step distance value is denoted F_ws, and the full score value is 100 points;
S402: a complete walking training session is carried out in the long-distance straight walking training mode; a storage module in the training evaluation system records, for each step of the gait trainer in the straight walking training, the coincidence rate P_S with the standard footprint, the step length F_1 and the step distance F_w, and a calculation module in the training evaluation system calculates the average value of the coincidence rate P_S, the average value of the step length F_1 and the average value of the step distance F_w;
S403: according to the average value of the coincidence rate with the standard footprint, the average value of the step length and the average value of the step distance, the gait Score of the gait trainer is calculated as follows:
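The score formula of claim 7 appears as an image in the source and is not reproduced above. The sketch below therefore computes the three averages that step S402 names and combines them under a purely hypothetical rule (the mean coincidence rate plus how closely the average step length and step distance match their standard values, scaled to the 100-point full score); the actual patented formula may differ.

```python
def straight_walk_score(ps_list, f1_list, fw_list, f1_std, fw_std, full=100.0):
    """Averages per S402; the combination rule below is a hypothetical placeholder."""
    ps_avg = sum(ps_list) / len(ps_list)      # average coincidence rate
    f1_avg = sum(f1_list) / len(f1_list)      # average step length
    fw_avg = sum(fw_list) / len(fw_list)      # average step distance
    # Hypothetical scoring: closeness of the averages to the standard values,
    # averaged with the mean coincidence rate and scaled to the full score.
    f1_term = max(0.0, 1.0 - abs(f1_avg - f1_std) / f1_std)
    fw_term = max(0.0, 1.0 - abs(fw_avg - fw_std) / fw_std)
    return full * (ps_avg + f1_term + fw_term) / 3.0

print(round(straight_walk_score([0.8, 0.9], [0.55, 0.60], [0.20, 0.22],
                                f1_std=0.60, fw_std=0.20), 1))
```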
8. The radar-based gait training evaluation method according to claim 5, wherein:
in step S40, in the training pool stepping training mode, the stepping training score calculation process in the gait evaluation algorithm is implemented as follows:
S404: presetting weighting coefficients d_n for the n different difficulty levels;
S405: a complete stepping training session is carried out in the training pool stepping training mode; a storage module in the training evaluation system records, for each of the different difficulty levels, the total number of successful avoidances or the total number of step-ons C_n achieved by the gait trainer at that difficulty level during the complete stepping training;
Score = d_1·C_1 + d_2·C_2 + … + d_n·C_n.
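Claim 8's weighted sum maps directly onto a few lines of code; the sketch below assumes the weighting coefficients d_1…d_n and the per-level counts C_1…C_n are supplied as parallel lists.

```python
def stepping_score(weights, counts):
    """Score = d_1*C_1 + d_2*C_2 + ... + d_n*C_n over the n difficulty levels."""
    if len(weights) != len(counts):
        raise ValueError("one weight is required per difficulty level")
    return sum(d * c for d, c in zip(weights, counts))

# Example: three difficulty levels with weights 1, 2, 3 and 10, 6, 2 successes
print(stepping_score([1, 2, 3], [10, 6, 2]))  # 1*10 + 2*6 + 3*2 = 28
```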
9. A radar-based gait training evaluation apparatus for use in the radar-based gait training evaluation method of any one of claims 1 to 7, comprising:
a radar gait rehabilitation training platform, used for rehabilitation training of the gait trainer;
a radar, used for acquiring foot detection data of the gait trainer;
a projector, used for projecting the planar virtual object onto the radar gait rehabilitation training platform; and
a training evaluation system, used for guiding the gait trainer through rehabilitation training in the different rehabilitation modes, and comprising a storage module, a calculation module, an evaluation module and a man-machine interaction module, wherein the storage module is used for storing the preset coordinate data of the planar virtual object, the calculation module is used for generating the real-time calculation data and the real-time association data, the evaluation module is used for calculating an evaluation result according to the output of the calculation module, and the man-machine interaction module is used for inputting control instructions and displaying the evaluation result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202110497172.4A (CN113197571B) | 2021-05-07 | 2021-05-07 | Gait training evaluation method and device based on radar
Publications (2)
Publication Number | Publication Date
---|---
CN113197571A (en) | 2021-08-03
CN113197571B (en) | 2024-07-19
Family
ID=77029589
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202110497172.4A (CN113197571B, Active) | Gait training evaluation method and device based on radar | 2021-05-07 | 2021-05-07
Country Status (1)
Country | Link
---|---
CN | CN113197571B (en)
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |