
CN106344030A - Posture correction method and device - Google Patents

Posture correction method and device

Info

Publication number
CN106344030A
Authority
CN
China
Prior art keywords
joint detection
joint
point
posture
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610756271.9A
Other languages
Chinese (zh)
Inventor
于邦仲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SUZHOU PINNUO NEW MEDICAL TECHNOLOGY Co Ltd
Original Assignee
SUZHOU PINNUO NEW MEDICAL TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SUZHOU PINNUO NEW MEDICAL TECHNOLOGY Co Ltd
Priority to CN201610756271.9A
Publication of CN106344030A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/681 Wristwatch-type devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6822 Neck
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6823 Trunk, e.g. chest, back, abdomen, hip
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271 Specific aspects of physiological measurement analysis
    • A61B5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7405 Details of notification to user or communication with user or patient; user input means using sound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7455 Details of notification to user or communication with user or patient; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a posture correction method and device. The posture correction method includes: determining at least two joint detection points and determining a target joint point among the at least two joint detection points; determining a standard posture set containing the standard included angles formed by every two joint detection points and the target joint point; determining a current posture set of the target object, which contains the current included angles formed by every two joint detection points and the target joint point; calculating the matching degree of the target object according to the standard included angles in the standard posture set and the current included angles in the current posture set; and judging whether the matching degree is smaller than a preset matching threshold and, if so, feeding back correction information. With this arrangement, the accuracy of posture correction can be effectively improved.

Description

Posture correction method and device
Technical Field
The invention relates to the technical field of medical instruments, in particular to a posture correction method and device.
Background
As the number of people suffering from bone diseases such as cervical spondylosis, scapulohumeral periarthritis and lumbar spondylosis keeps growing, more and more attention is being paid to the correction of poor postures such as bad sitting postures.
Existing poor-posture correction methods mainly rely on physical restraint: a specific wearable device, such as a correction belt, constrains the posture of the wearer. However, such a wearable device deforms to some extent as the wearing time increases, its restraining capability declines, and once the restraint weakens, the accuracy of posture correction drops.
Disclosure of Invention
The embodiment of the invention provides a posture correction method and device, which can effectively improve the accuracy of posture correction.
A method of posture correction, comprising:
determining at least two joint detection points, and determining a target joint point in the at least two joint detection points;
determining a standard posture set, wherein the standard posture set comprises standard included angles formed by every two joint detection points and the target joint point;
determining a current posture set of a target object, wherein the current posture set comprises current included angles formed by every two joint detection points and the target joint point;
calculating the matching degree of the target object according to the standard included angle in the standard posture set and the current included angle in the current posture set;
and judging whether the matching degree is smaller than a preset matching threshold value or not, and if so, feeding back correction information.
Preferably, the determining a standard posture set comprises:
setting a coordinate system;
in a standard posture, acquiring a standard coordinate of each joint detection point in a target object in the coordinate system;
calculating the distance between every two joint detection points according to the standard coordinate of each joint detection point in the coordinate system;
calculating a standard included angle formed by every two joint detection points in the target object and the target joint point according to a first formula;
combining standard included angles formed by every two joint detection points and the target joint point into a standard posture set;
the first formula:
α_ij = arccos( (d_io² + d_jo² - d_ij²) / (2 × d_io × d_jo) )
wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; d_io represents the distance between joint detection point i and the target joint point o in the target object; d_jo represents the distance between joint detection point j and the target joint point o in the target object; and d_ij represents the distance between joint detection point i and joint detection point j in the target object.
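The first formula is printed illegibly in the source text; read together with its variable definitions and with the worked lumbar example later in the description, it corresponds to the law of cosines applied to the three pairwise distances. A minimal Python sketch under that assumption (the function name included_angle is illustrative, not taken from the patent):

```python
import math

def included_angle(d_io: float, d_jo: float, d_ij: float) -> float:
    """Included angle alpha_ij at the target joint point o, formed by joint
    detection points i and j, computed from the three pairwise distances
    via the law of cosines (assumed reading of the first formula)."""
    cos_a = (d_io ** 2 + d_jo ** 2 - d_ij ** 2) / (2.0 * d_io * d_jo)
    # Clamp against floating-point drift before taking the arc cosine.
    return math.acos(max(-1.0, min(1.0, cos_a)))

# Distances 1, 2 and 1 describe collinear points, so the angle is 0,
# matching the lumbar example given later in the description.
print(included_angle(1.0, 2.0, 1.0))  # 0.0
```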
Preferably, the determining a current posture set of the target object comprises:
under the current posture, acquiring the current coordinate of each joint detection point in the target object in the coordinate system;
calculating the distance between every two joint detection points according to the current coordinate of each joint detection point;
calculating the current included angle formed by every two joint detection points and the target joint point in the target object according to a first formula;
and combining current included angles formed by every two joint detection points and the target joint point into a current posture set.
Preferably, the calculating the distance between each two joint detection points comprises:
calculating the distance between every two joint detection points according to a second formula;
the second formula:
d_ij = √( (x_i - x_j)² + (y_i - y_j)² + (z_i - z_j)² )
wherein d_ij represents the distance between joint detection point i and joint detection point j in the target object; x_i, y_i and z_i respectively represent the coordinate values of joint detection point i on the x axis, the y axis and the z axis; and x_j, y_j and z_j respectively represent the coordinate values of joint detection point j on the x axis, the y axis and the z axis.
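The second formula, read with its variable definitions, is the Euclidean distance between two joint detection points in the three-dimensional coordinate system. A short sketch (names illustrative):

```python
import math

def distance(p: tuple[float, float, float], q: tuple[float, float, float]) -> float:
    """Euclidean distance between two joint detection points given as (x, y, z)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# With joint detection point 1 at (1, 1, 0) and the target joint point at
# (1, 0, 0), as in the cervical example later in the description, the distance is 1.
print(distance((1.0, 1.0, 0.0), (1.0, 0.0, 0.0)))  # 1.0
```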
Preferably, the calculating the matching degree of the target object includes:
calculating the matching degree of the target object according to a third formula;
the third formula:
ω = e^( -Σ|α_ij - α_ij′| / N ) × 100%
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α_ij′ represents the current included angle in the current posture set corresponding to α_ij; and N represents the number of included angles in each posture set.
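Under the reading of the third formula given above (the mean absolute angle difference placed in a negative exponent), the matching degree can be sketched as follows; the function name is illustrative and the angles are in radians:

```python
import math

def matching_degree(standard: list[float], current: list[float]) -> float:
    """Matching degree omega = exp(-sum(|a_ij - a_ij'|) / N) * 100 %, where N is
    the number of included angles in each posture set."""
    if len(standard) != len(current) or not standard:
        raise ValueError("posture sets must be non-empty and of equal size")
    total = sum(abs(a - b) for a, b in zip(standard, current))
    return math.exp(-total / len(standard)) * 100.0

# Identical posture sets match perfectly.
print(matching_degree([0.1, 0.2, 0.3], [0.1, 0.2, 0.3]))  # 100.0
```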
Preferably, the above method further comprises: setting a timer;
after the feeding back of the correction information, further comprising: starting the timer, and when the timed duration reaches a preset timing threshold, executing the step of determining the current posture set of the target object.
Preferably, the target object includes: any one or more of cervical vertebra, lumbar vertebra and shoulder.
A posture correction device, comprising: a determination unit, a posture detection unit, a processing unit and a feedback unit, wherein,
the determining unit is used for determining at least two joint detection points and determining a target joint point in the at least two joint detection points;
the gesture detection unit is configured to determine a standard gesture set, where the standard gesture set includes a standard included angle formed by every two joint detection points determined by the determination unit and the target joint point, and determines a current gesture set of the target object, and the current gesture set includes a current included angle formed by every two joint detection points determined by the determination unit and the target joint point;
the processing unit is used for calculating the matching degree of the target object according to the standard included angle in the standard posture set determined by the posture detection unit and the current included angle in the current posture set, judging whether the matching degree is smaller than a preset matching threshold value or not, and if not, triggering the feedback unit;
and the feedback unit is used for feeding back the correction information when the trigger of the processing unit is received.
Preferably, the posture detecting unit includes:
a coordinate system setting subunit, configured to set a coordinate system;
the acquisition subunit acquires coordinates of each joint detection point in the target object in the coordinate system set by the coordinate system setting subunit when the target object is in a standard posture;
the first calculating subunit is used for calculating the distance between every two joint detection points according to the standard coordinates of each joint detection point acquired by the acquiring subunit in the coordinate system, and calculating the standard included angle formed by every two joint detection points in the target object and the target joint point according to the following first formula;
the first formula:
α_ij = arccos( (d_io² + d_jo² - d_ij²) / (2 × d_io × d_jo) )
wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; d_io represents the distance between joint detection point i and the target joint point o in the target object; d_jo represents the distance between joint detection point j and the target joint point o in the target object; and d_ij represents the distance between joint detection point i and joint detection point j in the target object.
And the construction subunit is used for combining all the standard included angles obtained by the calculation of the first calculation subunit into a standard posture set.
Preferably, the acquiring subunit is further configured to acquire, when the target object is in the current posture, current coordinates of each joint detection point in the target object in the coordinate system;
the first calculating subunit is further configured to calculate a distance between every two joint detection points according to the current coordinate of each joint detection point acquired by the acquiring subunit, and calculate a current included angle formed by every two joint detection points in the target object and the target joint point according to a first formula;
the constructing subunit is further configured to combine the current included angles calculated by the first calculating subunit into a current posture set.
Preferably, the first calculating subunit is configured to calculate a distance between each two joint detection points according to a second formula;
the second formula:
d_ij = √( (x_i - x_j)² + (y_i - y_j)² + (z_i - z_j)² )
wherein d_ij represents the distance between joint detection point i and joint detection point j in the target object; x_i, y_i and z_i respectively represent the coordinate values of joint detection point i on the x axis, the y axis and the z axis; and x_j, y_j and z_j respectively represent the coordinate values of joint detection point j on the x axis, the y axis and the z axis.
Preferably, the processing unit includes:
the second calculating subunit is used for calculating the matching degree of the target object according to a third formula;
the third formula:
ω = e^( -Σ|α_ij - α_ij′| / N ) × 100%
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α_ij′ represents the current included angle in the current posture set corresponding to α_ij; and N represents the number of included angles in each posture set.
Preferably, the above apparatus further comprises a timer, wherein:
the feedback unit is used for starting the timer;
the timer is used for timing, and when the timing reaches a preset timing threshold value, the acquisition subunit is triggered;
the acquisition subunit is further configured to, when receiving the trigger of the timer, perform acquisition of the current coordinates of each joint detection point in the target object in the coordinate system.
Preferably, the gesture detection unit is integrated in a first device, and the determination unit, the processing unit and the feedback unit are integrated in a second device.
Preferably, the determination unit and the gesture detection unit are integrated in a first device, and the processing unit and the feedback unit are integrated in a second device.
Preferably, the determination unit, the posture detection unit, and the processing unit are integrated in a first device, and the feedback unit is integrated in a second device.
Preferably, the determining unit, the posture detection unit, the processing unit and the feedback unit are integrated in a first device.
The embodiment of the invention provides a posture correction method and a posture correction device. The posture correction method comprises: determining at least two joint detection points, and determining a target joint point among the at least two joint detection points; determining a standard posture set, wherein the standard posture set comprises the standard included angles formed by every two joint detection points and the target joint point; determining a current posture set of the target object, wherein the current posture set comprises the current included angles formed by every two joint detection points and the target joint point; calculating the matching degree of the target object according to the standard included angles in the standard posture set and the current included angles in the current posture set; and judging whether the matching degree is smaller than a preset matching threshold and, if so, feeding back correction information. For a given posture, such as a sitting posture, the included angle formed by any two joint detection points on the target object, such as the lumbar vertebra, and the target joint point is fixed; as soon as the current posture differs from the standard posture, the current included angles differ from the standard included angles. The posture correction method can therefore judge accurately whether the current posture needs to be corrected, without being affected by the shape of the worn equipment or similar factors, so the accuracy of posture correction can be effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of a method for posture correction according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for posture correction according to another embodiment of the present invention;
FIG. 3 is a schematic plan view of an included angle formed by joint detection points in a lumbar spine portion corresponding to a standard posture and a current posture according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a posture improvement device, according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a posture-correcting device according to another embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a posture-correcting device according to yet another embodiment of the present invention;
fig. 7 is a schematic structural diagram of a posture-correcting device according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer and more complete, the technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention, and based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative efforts belong to the scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides a posture correction method, which may include the steps of:
step 101: determining at least two joint detection points, and determining a target joint point in the at least two joint detection points;
in this step, the determined at least two joint detection points are any joint detection points in joints included in the target object, such as cervical vertebra, lumbar vertebra, and the like, and the target joint point determined in this step may be set according to practical application, but in the determination process of the standard posture set and the current posture set, the target joint point is the same joint detection point.
Step 102: determining a standard posture set, wherein the standard posture set comprises standard included angles formed by every two joint detection points and a target joint point;
for example, 5 lumbar vertebrae in the spine support the movements of the waist, and 4 joints are contained between the 5 lumbar vertebrae. A joint detection point is determined at each of the 4 joints, and one of these joint detection points is determined as the target joint point. The standard posture set then mainly comprises the standard included angles formed, at the target joint point, by every two of the remaining 3 joint detection points when the waist maintains a standard posture such as a standard sitting posture or a standard standing posture.
Step 103: determining a current posture set of the target object, wherein the current posture set comprises current included angles formed by every two joint detection points and the target joint point;
for example, the 5 lumbar vertebrae in the spine contain 4 joints, in which the joint detection points and the target joint point are determined as above; the current posture set then mainly comprises the current included angles formed by every two of the remaining joint detection points and the target joint point when the waist is in the current posture.
Step 104: calculating the matching degree of the target object according to the standard included angle in the standard posture set and the current included angle in the current posture set;
when the target object such as lumbar vertebra is in different postures, included angles formed by any two corresponding joint detection points and the target joint point are different, and then whether the posture of the target object is standard or not can be determined by matching the current included angle and the standard included angle.
Step 105: judging whether the matching degree is smaller than a preset matching threshold value, if so, executing a step 106; otherwise, go to step 103;
step 106: and feeding back the correction information.
In an embodiment of the present invention, in order to accurately calculate a standard included angle formed by each two joint detection points and a target joint point to form a standard posture set, a specific implementation manner of step 102 includes: setting a coordinate system, and acquiring a standard coordinate of each joint detection point in a target object in the coordinate system in a standard posture; calculating the distance between every two joint detection points according to the standard coordinates of each joint detection point in the coordinate system; calculating a standard included angle formed by every two joint detection points in the target object and the target joint point according to a formula (1); combining standard included angles formed by every two joint detection points and the target joint point into a standard posture set;
The formula (1) is: α_ij = arccos( (d_io² + d_jo² - d_ij²) / (2 × d_io × d_jo) ), wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; d_io represents the distance between joint detection point i and the target joint point o in the target object; d_jo represents the distance between joint detection point j and the target joint point o in the target object; and d_ij represents the distance between joint detection point i and joint detection point j in the target object. For example, if at the lumbar vertebra part the distance between joint detection point 1 and the target joint point o is 1, the distance between joint detection point 2 and the target joint point o is 2, and the distance between joint detection point 1 and joint detection point 2 is 1, then α_12 calculated according to formula (1) is arccos((1² + 2² - 1²) / (2 × 1 × 2)) = 0; the included angle α_13 formed by joint detection point 1, joint detection point 3 and the target joint point o and the included angle α_23 formed by joint detection point 2, joint detection point 3 and the target joint point o are calculated in the same way, and the standard posture set is then composed of the standard included angles formed by every two joint detection points and the target joint point.
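Combining the steps of this embodiment, a posture set (standard or current) can be built by computing, for every pair of joint detection points, their included angle at the target joint point. A sketch under the law-of-cosines reading of formula (1); all names are illustrative:

```python
import math
from itertools import combinations

Point = tuple[float, float, float]

def dist(p: Point, q: Point) -> float:
    """Formula (2): Euclidean distance between two points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle_at(o: Point, p: Point, q: Point) -> float:
    """Formula (1), assumed to be the law of cosines: included angle at the
    target joint point o formed by joint detection points p and q."""
    d_po, d_qo, d_pq = dist(p, o), dist(q, o), dist(p, q)
    cos_a = (d_po ** 2 + d_qo ** 2 - d_pq ** 2) / (2.0 * d_po * d_qo)
    return math.acos(max(-1.0, min(1.0, cos_a)))

def posture_set(target: Point, detections: list[Point]) -> list[float]:
    """One included angle per pair of joint detection points, measured at the
    target joint point; the same routine yields the standard or the current set."""
    return [angle_at(target, p, q) for p, q in combinations(detections, 2)]
```

The same posture_set routine is reused unchanged for the current posture in step 103: only the coordinates change between the two acquisitions.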
In an embodiment of the present invention, the specific implementation of step 103 includes: acquiring the current coordinate of each joint detection point in the target object in the coordinate system under the current posture of the target object; calculating the distance between every two joint detection points according to the current coordinate of each joint detection point; calculating the current included angle formed by every two joint detection points in the target object and the target joint point according to formula (1); and combining the current included angles formed by every two joint detection points and the target joint point into a current posture set. For example, from the current distances between joint detection point 1, joint detection point 2, joint detection point 3 and the target joint point o, the current included angle α_12′ corresponding to α_12 is calculated according to formula (1), and the corresponding current included angles α_13′ and α_23′ are obtained in the same way; together they form the current posture set.
In an embodiment of the present invention, the specific process of calculating the distance between every two joint detection points according to the coordinates of each joint detection point includes: calculating the distance between every two joint detection points according to the formula (2);
The formula (2) is: d_ij = √( (x_i - x_j)² + (y_i - y_j)² + (z_i - z_j)² ), wherein d_ij represents the distance between joint detection point i and joint detection point j in the target object; x_i, y_i and z_i respectively represent the coordinate values of joint detection point i on the x axis, the y axis and the z axis; and x_j, y_j and z_j respectively represent the coordinate values of joint detection point j on the x axis, the y axis and the z axis. For example, if in the current cervical posture the coordinates of the target joint point o are (1, 0, 0), the coordinates of joint detection point 1 are (1, 1, 0) and the coordinates of joint detection point 2 are (1, 2, 0), then the distance between joint detection point 1 and the target joint point o is 1, the distance between joint detection point 2 and the target joint point o is 2, and the distance between joint detection point 1 and joint detection point 2 is 1.
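The three distances quoted in this cervical example follow directly from formula (2) and the stated coordinates; a quick, self-contained check:

```python
import math

def dist(p, q):
    # Formula (2): Euclidean distance.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

o, p1, p2 = (1, 0, 0), (1, 1, 0), (1, 2, 0)    # target joint point o, detection points 1 and 2
print(dist(p1, o), dist(p2, o), dist(p1, p2))  # 1.0 2.0 1.0
```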
in an embodiment of the present invention, in order to accurately calculate the matching degree, the specific implementation manner of step 104 includes: calculating a matching degree of the target object according to the following formula (3);
ω = e^( -Σ|α_ij - α_ij′| / N ) × 100%    (3)
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α_ij′ represents the current included angle in the current posture set corresponding to α_ij; and N represents the number of included angles in each posture set. For example, the matching degree is obtained by substituting the standard posture set and the current posture set obtained above into formula (3). Since the matching degree depends on the differences between the included angles formed by every two joint detection points and the target joint point in the standard posture and in the current posture, the smaller these differences are, the higher the calculated matching degree is, i.e. the closer the current posture is to the standard posture.
In an embodiment of the present invention, in order to ensure real-time detection and real-time feedback of correction information, the method further includes setting a timer. After step 106, the method further comprises: starting the timer, and when the timed duration reaches a preset timing threshold, executing step 103 again. For example, when the timing threshold is 5 s or 10 s, the current posture set of the target object is determined again every 5 s or 10 s, so that the user is reminded to correct the posture in real time; in addition, the timer keeps the program flow orderly.
In one embodiment of the invention, the target object comprises: any one or more of cervical vertebra, lumbar vertebra and shoulder.
In another embodiment of the present invention, the posture correction method is described by taking the lumbar vertebra part as the target object and the correction of its posture as an example; as shown in fig. 2, the method may include the following steps:
step 200: setting a coordinate system, a matching threshold and a timer;
step 201: determining at least two joint detection points at the lumbar vertebra part, and determining a target joint point in the at least two joint detection points;
the lumbar vertebra part contains 5 vertebrae, and every two vertebrae junctions are the joint, determine a joint check point in every joint department, determine 4 joint check points in the lumbar vertebra part then.
Step 202: in a standard posture, acquiring a standard coordinate of each joint detection point in the lumbar vertebra part in a coordinate system;
In this step, the standard postures may be classified into the standard posture of the lumbar region when standing, the standard posture when lying down, the standard posture when sitting still, and the standard posture when the waist is rotated by a certain angle, and the standard postures may be collected separately for different users before posture correction is performed. In the coordinate system, each joint detection point of the lumbar region in the standard posture has a coordinate value; as shown in fig. 3A, in one standard posture of the lumbar region the coordinates of the joint detection points are: joint detection point 0 at (0, 1, 0), joint detection point 1 at (0, 2, 0), joint detection point 2 at (0, 4, 0) and joint detection point 3 at (1, 5, 0), wherein joint detection point 0 is the target joint point determined in step 201.
Step 203: calculating the distance between every two joint detection points according to the standard coordinates of each joint detection point in the coordinate system;
in the step, the distance between every two joint detection points is calculated mainly according to the formula (2);
Then, according to formula (2), the distance between joint detection point 0 (the target joint point o) and joint detection point 1 is 1, the distance between joint detection point 0 and joint detection point 2 is 3, the distance between joint detection point 0 and joint detection point 3 is √17, the distance between joint detection point 1 and joint detection point 2 is 2, the distance between joint detection point 1 and joint detection point 3 is √10, and the distance between joint detection point 2 and joint detection point 3 is √2.
Step 204: calculating a standard included angle formed by every two joint detection points in the lumbar vertebra part and a target joint point;
in the step, a standard included angle formed by every two joint detection points and a target joint point in the lumbar vertebra part is calculated mainly according to a formula (1);
The formula (1) is: α_ij = arccos( (d_io² + d_jo² - d_ij²) / (2 × d_io × d_jo) ), wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; d_io represents the distance between joint detection point i and the target joint point o in the target object; d_jo represents the distance between joint detection point j and the target joint point o in the target object; and d_ij represents the distance between joint detection point i and joint detection point j in the target object.
Then, the distances calculated in step 203 are substituted into formula (1) to obtain the included angle α_12 formed by joint detection point 1, joint detection point 2 and the target joint point o, the included angle α_13 formed by joint detection point 1, joint detection point 3 and the target joint point o, and the included angle α_23 formed by joint detection point 2, joint detection point 3 and the target joint point o, wherein α_12 = 0; α_13 and α_23 are calculated in the same way and both equal arccos(4/√17), i.e. approximately 0.08π.
Step 205: combining standard included angles formed by every two joint detection points and the target joint point into a standard posture set;
Then, as a result of the calculation in step 204, the standard posture set is {0, arccos(4/√17), arccos(4/√17)}, i.e. approximately {0, 0.08π, 0.08π}.
step 206: collecting the coordinates of each joint detection point of the lumbar vertebra part in a coordinate system under the current posture of the lumbar vertebra part;
the lumbar spine part in the current posture as shown in fig. 3B has the following coordinates in the coordinate system at the respective joint detection points: the coordinates of the joint detection point 0 (target joint point) are (1, 1, 0), the coordinates of the joint detection point 1 are (2, 2, 1), the coordinates of the joint detection point 2 are (2, 4, 0), and the coordinates of the joint detection point 3 are (3, 4, 1).
Step 207: calculating the distance between every two joint detection points according to the current coordinate of each joint detection point;
According to formula (2) given in step 203, the distance between joint detection point 0 (the target joint point o) and joint detection point 1 is √3, the distance between joint detection point 0 and joint detection point 2 is √10, the distance between joint detection point 0 and joint detection point 3 is √14, the distance between joint detection point 1 and joint detection point 2 is √5, the distance between joint detection point 1 and joint detection point 3 is √5, and the distance between joint detection point 2 and joint detection point 3 is √2.
step 208: calculating a current included angle formed by every two joint detection points in the target object and the target joint point;
According to formula (1) given in step 204, the current included angle α_12′ formed by joint detection point 1, joint detection point 2 and the target joint point o, the current included angle α_13′ formed by joint detection point 1, joint detection point 3 and the target joint point o, and the current included angle α_23′ formed by joint detection point 2, joint detection point 3 and the target joint point o are obtained, wherein α_12′ = 0.24π, α_13′ = 0.12π and α_23′ = 0.12π.
Step 209: combining current included angles formed by every two joint detection points and the target joint point into a current posture set;
That is, in this step the current posture set is composed as {0.24π, 0.12π, 0.12π}.
Step 210: calculating the matching degree of the lumbar vertebra part according to the standard included angle in the standard posture set and the current included angle in the current posture set;
when the target object such as a lumbar vertebra part is in different postures, the included angles between the two corresponding joint detection points and the target joint point are different, so that whether the posture of the target object is standard or not can be determined by matching the current included angle formed by every two joint detection points and the target joint point with the corresponding standard included angle. The step is mainly to calculate the matching degree of the current posture and the standard posture according to a formula (3);
ω = e^( -Σ|α_ij - α_ij′| / N ) × 100%    (3)
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α_ij′ represents the current included angle in the current posture set corresponding to α_ij; and N represents the number of included angles in each posture set. It can be seen from formula (3) that the better two corresponding included angles match, the smaller |α_ij - α_ij′| is, the greater the matching degree ω of the current posture with the standard posture is, and the closer the current posture is to the standard posture.
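Assuming the law-of-cosines reading of formula (1), the figures quoted in this lumbar walk-through can be reproduced end to end from the coordinates given in steps 202 and 206. The sketch below (illustrative names) yields the current angles 0.24π, 0.12π and 0.12π stated in step 208; the resulting matching degree is not stated in the source and is simply computed here from formula (3):

```python
import math
from itertools import combinations

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle_at(o, p, q):
    d_po, d_qo, d_pq = dist(p, o), dist(q, o), dist(p, q)
    return math.acos(max(-1.0, min(1.0,
        (d_po ** 2 + d_qo ** 2 - d_pq ** 2) / (2.0 * d_po * d_qo))))

def posture_set(o, pts):
    return [angle_at(o, p, q) for p, q in combinations(pts, 2)]

def matching_degree(standard, current):
    return math.exp(-sum(abs(a - b) for a, b in zip(standard, current))
                    / len(standard)) * 100.0

# Standard-posture coordinates (step 202) and current coordinates (step 206);
# joint detection point 0 is the target joint point o.
std_o, std_pts = (0, 1, 0), [(0, 2, 0), (0, 4, 0), (1, 5, 0)]
cur_o, cur_pts = (1, 1, 0), [(2, 2, 1), (2, 4, 0), (3, 4, 1)]

standard_set = posture_set(std_o, std_pts)
current_set = posture_set(cur_o, cur_pts)
print([round(a / math.pi, 2) for a in current_set])          # [0.24, 0.12, 0.12]
print(round(matching_degree(standard_set, current_set), 1))  # about 71.0
```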
Step 211: judging whether the matching degree is smaller than a preset matching threshold value, if so, executing a step 212; otherwise, go to step 206;
In this step, the matching threshold may be set according to the different requirements of the user; when the matching degree is not smaller than the preset matching threshold, it indicates that the posture of the lumbar vertebra part is correct, and when the matching degree is smaller than the preset matching threshold, it indicates that the posture of the lumbar vertebra part is incorrect.
Step 212: feeding back correction information, starting a timer for timing, and executing step 206 when the timing reaches a preset timing threshold.
The correction information in this step may be fed back by means of a message reminder such as a vibration reminder or a ring-tone reminder. By starting the timer, the current posture is judged again when the timing threshold of the timer is reached: if the user has adjusted the posture of the lumbar vertebra part to the standard posture, no further reminder is given; if the user still has not adjusted the posture to the standard posture, the reminder is issued again. By setting the timer and the timing threshold, the reminding frequency can be controlled through the timing threshold. For example, when the timing threshold is 5 s, the process from step 206 to step 211 is executed again after 5 s, and if the posture has still not been corrected, step 212 is executed again. Because the process from step 206 to step 211 is completed automatically in a very short time (on the order of tens of milliseconds), from the user's point of view, as long as the posture of the lumbar part has not been corrected, correction information is received once every 5 s, for example a vibration reminder every 5 s, thereby realizing real-time reminding and correction.
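The reminder cycle described here (steps 206 to 212 plus the timer) amounts to a simple monitoring loop. A sketch with placeholder sensor-reading and feedback functions; the threshold values and names are illustrative, not specified by the patent:

```python
import math
import time

TIMING_THRESHOLD_S = 5.0   # timing threshold, e.g. 5 s as in the example above
MATCH_THRESHOLD = 90.0     # preset matching threshold in percent (illustrative value)

def matching_degree(standard, current):
    """Formula (3)."""
    return math.exp(-sum(abs(a - b) for a, b in zip(standard, current))
                    / len(standard)) * 100.0

def read_current_posture_set():
    """Placeholder for steps 206-209: sample the joint detection points and
    return the current posture set (one included angle per pair of points)."""
    raise NotImplementedError

def feed_back_correction():
    """Placeholder for step 212 / the feedback unit (vibration, ring tone, message)."""
    print("please correct your posture")

def monitor(standard_set):
    while True:
        current_set = read_current_posture_set()                          # steps 206-209
        if matching_degree(standard_set, current_set) < MATCH_THRESHOLD:  # steps 210-211
            feed_back_correction()                                        # step 212
            time.sleep(TIMING_THRESHOLD_S)                                # timer, then re-check
        # otherwise the flow returns directly to step 206, as in fig. 2
```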
The above embodiments are mainly directed to lumbar regions, and are also applicable to cervical vertebrae, shoulder joints, and the like, except for differences in the determined joint detection points.
As shown in fig. 4, an embodiment of the present invention provides a posture correction apparatus including: a determination unit 401, a gesture detection unit 402, a processing unit 403, and a feedback unit 404, wherein,
a determining unit 401, configured to determine at least two joint detection points, and determine a target joint point among the at least two joint detection points;
a posture detection unit 402, configured to determine a standard posture set, wherein the standard posture set includes the standard included angles formed by every two joint detection points determined by the determination unit 401 and the target joint point, and to determine a current posture set of the target object, wherein the current posture set includes the current included angles formed by every two joint detection points determined by the determination unit 401 and the target joint point;
a processing unit 403, configured to calculate the matching degree of the target object according to the standard included angles in the standard posture set determined by the posture detection unit 402 and the current included angles in the current posture set, and to judge whether the matching degree is smaller than a preset matching threshold and, if so, trigger the feedback unit 404;
a feedback unit 404, configured to feed back the correction information when receiving the trigger of the processing unit 403.
As shown in fig. 5, in another embodiment of the present invention, the posture detection unit 402 includes:
a coordinate system setting subunit 501 configured to set a coordinate system;
a collecting subunit 502 that collects coordinates of each joint detection point in the target object in the coordinate system set by the coordinate system setting subunit 501 when the target object is in the standard posture;
the first calculating subunit 503 is configured to calculate a distance between every two joint detection points according to the standard coordinate of each joint detection point acquired by the acquiring subunit 502 in the coordinate system, and calculate a standard included angle formed by every two joint detection points in the target object and the target joint point according to the formula (1);
The formula (1) is: α_ij = arccos( (d_io² + d_jo² - d_ij²) / (2 × d_io × d_jo) ), wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; d_io represents the distance between joint detection point i and the target joint point o in the target object; d_jo represents the distance between joint detection point j and the target joint point o in the target object; and d_ij represents the distance between joint detection point i and joint detection point j in the target object.
A constructing subunit 504, configured to combine the standard angles calculated by the first calculating subunit 503 into a standard posture set.
In yet another embodiment of the present invention, the acquiring subunit 502 is further configured to acquire current coordinates of each joint detection point in the target object in the coordinate system when the target object is in the current posture;
the first calculating subunit 503 is further configured to calculate a distance between every two joint detection points according to the current coordinate of each joint detection point acquired by the acquiring subunit 502, and calculate a current included angle formed by every two joint detection points in the target object and the target joint point according to the formula (1);
the constructing subunit 504 is further configured to combine the current angles calculated by the first calculating subunit 503 into a current posture set.
In another embodiment of the present invention, the first calculating subunit 503 is configured to calculate a distance between every two joint detection points according to formula (2);
The formula (2) is: d_ij = √( (x_i - x_j)² + (y_i - y_j)² + (z_i - z_j)² ), wherein d_ij represents the distance between joint detection point i and joint detection point j in the target object; x_i, y_i and z_i respectively represent the coordinate values of joint detection point i on the x axis, the y axis and the z axis; and x_j, y_j and z_j respectively represent the coordinate values of joint detection point j on the x axis, the y axis and the z axis.
As shown in fig. 6, in another embodiment of the present invention, the processing unit 403 includes:
a second calculating subunit 601, configured to calculate a matching degree of the target object according to formula (3);
ω = e^( -Σ|α_ij - α_ij′| / N ) × 100%    (3)
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α_ij′ represents the current included angle in the current posture set corresponding to α_ij; and N represents the number of included angles in each posture set.
In another embodiment of the present invention, as shown in fig. 7, the apparatus further comprises a timer 701, wherein:
a feedback unit 404 for starting a timer 701;
a timer 701, configured to time, and when the time reaches a preset time threshold, trigger the acquisition subunit 502;
an acquisition subunit 502, further configured to execute, when receiving a trigger of the timer 701, acquiring current coordinates of each joint detection point in the target object in the coordinate system.
In another embodiment of the present invention, the posture detection unit is integrated in a first device, and the determination unit, the processing unit and the feedback unit are integrated in a second device. The first device is a device fixed at the target object, for example a device that can be fixed at the lumbar vertebra or the cervical vertebra, and the second device may be a mobile terminal such as a mobile phone, or a smart device such as a bracelet or a smart watch; the mobile terminal or smart device is connected with the first device, for example via Bluetooth, so that the two devices can communicate with each other. For example: at least two joint detection points and the standard posture set corresponding to the standard posture are determined on the bracelet, and the joint detection points are provided to the first device through Bluetooth; the first device detects the coordinates of the joint detection points, calculates the included angles formed by every two joint detection points and the target joint point, and sends the calculated included angles to the second device; the second device calculates the matching degree through its processing unit and judges whether the matching degree is smaller than the preset matching threshold, thereby judging whether the current posture is correct; when the current posture is incorrect, the second device reminds the user to correct the posture through the feedback unit, for example by vibration.
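The division of labour in this two-device arrangement can be sketched as a pair of messages: the body-mounted first device reports the current included angles, and the second device (phone or bracelet) applies the processing-unit logic and decides whether to alert. The transport (e.g. Bluetooth) is abstracted away and all names are illustrative:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class AngleReport:
    """Sent by the first (body-mounted) device: current included angles in radians."""
    current_set: list[float]

@dataclass
class CorrectionAlert:
    """Raised by the second device when the posture deviates too far."""
    matching_degree: float

def handle_report(report: AngleReport, standard_set: list[float],
                  match_threshold: float) -> Optional[CorrectionAlert]:
    """Processing-unit logic on the second device: formula (3) plus threshold test."""
    omega = math.exp(-sum(abs(a - b) for a, b in zip(standard_set, report.current_set))
                     / len(standard_set)) * 100.0
    return CorrectionAlert(omega) if omega < match_threshold else None
```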
In yet another embodiment of the present invention, the determination unit and the posture detection unit are integrated in a first device, and the processing unit and the feedback unit are integrated in a second device. The first device is a device fixed at the target object, for example a device that can be fixed at the lumbar vertebra or the cervical vertebra, and it contains a standard posture set corresponding to the standard posture, set for example before the device leaves the factory. The first device provides the second device with this standard posture set and with the current posture set corresponding to the current posture it detects, and the second device determines whether the current posture is correct (i.e. the working process of the processing unit) according to the standard posture set and the current posture set provided by the first device.
In another embodiment of the present invention, the determination unit, the posture detection unit, and the processing unit are integrated in a first device, and the feedback unit is integrated in a second device, that is, the second device, such as a mobile phone, a bracelet, a smart watch, and other smart devices, sends feedback information, such as vibration, to remind the user to correct the posture.
In another embodiment of the present invention, the determining unit, the posture detecting unit, the processing unit and the feedback unit are integrated in a first device, that is, the user only needs to place the first device on a target object part such as a cervical vertebra and a lumbar vertebra, so as to monitor joint detection points of the cervical vertebra and the lumbar vertebra, and simultaneously judge the correctness of the postures of the cervical vertebra and the lumbar vertebra, and when the posture of the detected part is incorrect, corrective information such as vibration is directly sent to remind the user.
Because the information interaction, execution process, and other contents between the units in the device are based on the same concept as the method embodiment of the present invention, specific contents may refer to the description in the method embodiment of the present invention, and are not described herein again.
According to the scheme, the embodiments of the invention have at least the following beneficial effects:
1. Determining at least two joint detection points, and determining a target joint point among the at least two joint detection points; determining a standard posture set, wherein the standard posture set comprises the standard included angles formed by every two joint detection points and the target joint point; determining a current posture set of the target object, wherein the current posture set comprises the current included angles formed by every two joint detection points and the target joint point; calculating the matching degree of the target object according to the standard included angles in the standard posture set and the current included angles in the current posture set; and judging whether the matching degree is smaller than a preset matching threshold and, if so, feeding back correction information. For a given posture, such as a sitting posture, the included angle formed by any two joint detection points on the target object, such as the lumbar vertebra, and the target joint point is fixed; as soon as the current posture differs from the standard posture, the current included angles differ from the standard included angles. The posture correction method can therefore judge accurately whether the current posture needs to be corrected, without being affected by the shape of the worn equipment or similar factors, so the accuracy of posture correction can be effectively improved.
2. According to the posture correction method provided by the invention, when the matching degree is judged to be smaller than the preset matching threshold, the current posture of the target object is incorrect, and the wearer is reminded to correct through feedback correction information, for example: when the target object is a lumbar vertebra part, the method provided by the embodiment of the invention judges that the posture of the lumbar vertebra part is incorrect, and when the user receives feedback information, the posture of the lumbar vertebra part is actively adjusted, so that the intelligent reminding of the wearer to actively correct the poor posture is realized.
3. The method comprises setting a timer, starting the timer after the correction information is fed back, and determining the current posture set of the target object again when the timing reaches a preset timing threshold, the current posture set comprising the current included angles formed by every two joint detection points and the target joint point; that is, when the timing threshold is reached, if the current posture still does not match the standard posture, the correction information is fed back again.
4. In the embodiments of the invention, the posture of the wearer is corrected mainly by reminding the wearer to actively adjust his or her posture, without passively restraining target objects such as the lumbar vertebra and the cervical vertebra, so that the wearer's freedom of movement is ensured.
5. The posture correction device provided by the embodiments of the invention is practical because it does not passively restrict the wearer's behavior, and the growth of the wearer's bones, muscles and the like is not affected even if the device is worn for a long time.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other similar elements in a process, method, article, or apparatus that comprises the element.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it is to be noted that: the above description is only a preferred embodiment of the present invention, and is only used to illustrate the technical solutions of the present invention, and not to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A method of correcting posture, comprising:
determining at least two joint detection points, and determining a target joint point in the at least two joint detection points;
determining a standard posture set, wherein the standard posture set comprises standard included angles formed by every two joint detection points and the target joint point;
determining a current posture set of a target object, wherein the current posture set comprises current included angles formed by every two joint detection points and the target joint point;
calculating the matching degree of the target object according to the standard included angle in the standard posture set and the current included angle in the current posture set;
and judging whether the matching degree is smaller than a preset matching threshold value or not, and if so, feeding back correction information.
2. The method of claim 1, wherein the determining a set of standard gestures comprises:
setting a coordinate system;
in a standard posture, acquiring a standard coordinate of each joint detection point in a target object in the coordinate system;
calculating the distance between every two joint detection points according to the standard coordinate of each joint detection point in the coordinate system;
calculating a standard included angle formed by every two joint detection points in the target object and the target joint point according to a first formula;
combining standard included angles formed by every two joint detection points and the target joint point into a standard posture set;
the first formula:
α_ij = arccos((L_io² + L_jo² - L_ij²) / (2 × L_io × L_jo))
wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; L_io represents the distance between joint detection point i and the target joint point o in the target object; L_jo represents the distance between joint detection point j and the target joint point o in the target object; and L_ij represents the distance between joint detection point i and joint detection point j in the target object.
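The first formula can be read directly off the wherein clause: the included angle at the target joint point o follows from the three pairwise distances by the law of cosines, and the symbol names L_io, L_jo and L_ij are reconstructed for readability. A minimal Python sketch under that reading, with joint detection points given as (x, y, z) tuples:

```python
import math

def included_angle(point_i, point_j, target_o):
    """First formula: the included angle alpha_ij (in radians) formed by joint
    detection points i and j at the target joint point o, obtained from the
    three pairwise distances via the law of cosines."""
    l_io = math.dist(point_i, target_o)   # distance between joint detection point i and o
    l_jo = math.dist(point_j, target_o)   # distance between joint detection point j and o
    l_ij = math.dist(point_i, point_j)    # distance between joint detection points i and j
    cos_alpha = (l_io ** 2 + l_jo ** 2 - l_ij ** 2) / (2 * l_io * l_jo)
    return math.acos(max(-1.0, min(1.0, cos_alpha)))  # clamp to guard against rounding error
```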
3. The method of claim 2, wherein determining the current set of poses for the target object comprises:
under the current posture, acquiring the current coordinate of each joint detection point in the target object in the coordinate system;
calculating the distance between every two joint detection points according to the current coordinate of each joint detection point;
calculating the current included angle formed by every two joint detection points and the target joint point in the target object according to a first formula;
and combining current included angles formed by every two joint detection points and the target joint point into a current posture set.
4. The method according to claim 2 or 3,
the calculating the distance between every two joint detection points comprises the following steps:
calculating the distance between every two joint detection points according to a second formula;
the second formula:
L_ij = √((x_i - x_j)² + (y_i - y_j)² + (z_i - z_j)²)
wherein L_ij represents the distance value between joint detection point i and joint detection point j in the target object; x_i, y_i and z_i respectively represent the coordinate values of joint detection point i on the x axis, the y axis and the z axis; and x_j, y_j and z_j respectively represent the coordinate values of joint detection point j on the x axis, the y axis and the z axis;
and/or,
the calculating the matching degree of the target object comprises the following steps:
calculating the matching degree of the target object according to a third formula;
the third formula:
ω = e^(-Σ|α_ij - α'_ij| / N) × 100%
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α'_ij represents the current included angle corresponding to α_ij in the current posture set; and N represents the number of included angles in the posture set.
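A compact sketch of the second and third formulas; it assumes the standard and current included angles are supplied in matching order, which the claim does not spell out.

```python
import math

def distance(point_i, point_j):
    """Second formula: Euclidean distance between two joint detection points
    from their coordinate values on the x, y and z axes."""
    (xi, yi, zi), (xj, yj, zj) = point_i, point_j
    return math.sqrt((xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2)

def matching_degree(standard_angles, current_angles):
    """Third formula: omega = e^(-(sum of |alpha_ij - alpha'_ij|) / N) x 100%,
    where N is the number of included angles in the posture set."""
    n = len(standard_angles)
    total_difference = sum(abs(a - a_current)
                           for a, a_current in zip(standard_angles, current_angles))
    return math.exp(-total_difference / n) * 100.0
```

When the current angles equal the standard angles the matching degree is 100%, and it decays toward 0% as the total angular deviation grows, which is why claim 1 feeds back correction information when the degree drops below the preset threshold.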
5. The method according to any one of claims 1 to 3,
further comprising: setting a timer;
after the feeding back of the correction information, further comprising: starting the timer, and when the timing reaches a preset timing threshold, executing the step of determining the current posture set of the target object;
and/or,
the target object, comprising: any one or more of cervical vertebra, lumbar vertebra and shoulder.
6. A posture correction device, comprising: a determining unit, a posture detection unit, a processing unit and a feedback unit, wherein,
the determining unit is used for determining at least two joint detection points and determining a target joint point in the at least two joint detection points;
the posture detection unit is used for determining a standard posture set, wherein the standard posture set comprises standard included angles formed by every two joint detection points determined by the determining unit and the target joint point, and for determining a current posture set of the target object, wherein the current posture set comprises current included angles formed by every two joint detection points determined by the determining unit and the target joint point;
the processing unit is used for calculating the matching degree of the target object according to the standard included angles in the standard posture set determined by the posture detection unit and the current included angles in the current posture set, judging whether the matching degree is smaller than a preset matching threshold, and if so, triggering the feedback unit;
and the feedback unit is used for feeding back the correction information when the trigger of the processing unit is received.
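As an illustrative reading of claim 6's unit decomposition (the method names below are hypothetical; only the wiring between the four units follows the claim):

```python
class PostureCorrectionDevice:
    """Four cooperating units per claim 6: the processing unit compares the
    standard and current posture sets and triggers the feedback unit when the
    matching degree is smaller than the preset matching threshold."""

    def __init__(self, determining_unit, posture_detection_unit,
                 processing_unit, feedback_unit):
        self.determining_unit = determining_unit
        self.posture_detection_unit = posture_detection_unit
        self.processing_unit = processing_unit
        self.feedback_unit = feedback_unit

    def check_once(self, matching_threshold):
        points, target = self.determining_unit.determine_points()
        standard_set = self.posture_detection_unit.standard_posture_set(points, target)
        current_set = self.posture_detection_unit.current_posture_set(points, target)
        degree = self.processing_unit.matching_degree(standard_set, current_set)
        if degree < matching_threshold:
            self.feedback_unit.feed_back("correction information")
```

Claim 10 then only varies which of these units are integrated in a first device and which in a second device, without changing this call flow.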
7. The apparatus according to claim 6, wherein the posture detection unit comprises:
a coordinate system setting subunit, configured to set a coordinate system;
the acquisition subunit acquires coordinates of each joint detection point in the target object in the coordinate system set by the coordinate system setting subunit when the target object is in a standard posture;
the first calculating subunit is used for calculating the distance between every two joint detection points according to the standard coordinates of each joint detection point acquired by the acquiring subunit in the coordinate system, and calculating the standard included angle formed by every two joint detection points in the target object and the target joint point according to the following first formula;
the first formula:
α_ij = arccos((L_io² + L_jo² - L_ij²) / (2 × L_io × L_jo))
wherein α_ij represents the included angle formed by joint detection point i, joint detection point j and the target joint point o; L_io represents the distance between joint detection point i and the target joint point o in the target object; L_jo represents the distance between joint detection point j and the target joint point o in the target object; and L_ij represents the distance between joint detection point i and joint detection point j in the target object.
And the construction subunit is used for combining all the standard included angles obtained by the calculation of the first calculation subunit into a standard posture set.
8. The apparatus of claim 7,
the acquisition subunit is further configured to acquire a current coordinate of each joint detection point in the target object in the coordinate system when the target object is in the current posture;
the first calculating subunit is further configured to calculate a distance between every two joint detection points according to the current coordinate of each joint detection point acquired by the acquiring subunit, and calculate a current included angle formed by every two joint detection points in the target object and the target joint point according to a first formula;
the constructing subunit is further configured to combine the current included angles calculated by the first calculating subunit into a current posture set.
9. The apparatus of claim 8,
the first calculating subunit is used for calculating the distance between every two joint detection points according to a second formula;
the second formula:
L_ij = √((x_i - x_j)² + (y_i - y_j)² + (z_i - z_j)²)
wherein L_ij represents the distance value between joint detection point i and joint detection point j in the target object; x_i, y_i and z_i respectively represent the coordinate values of joint detection point i on the x axis, the y axis and the z axis; and x_j, y_j and z_j respectively represent the coordinate values of joint detection point j on the x axis, the y axis and the z axis;
and/or,
the processing unit includes:
the second calculating subunit is used for calculating the matching degree of the target object according to a third formula;
the third formula:
ω = e^(-Σ|α_ij - α'_ij| / N) × 100%
wherein ω represents the matching degree; α_ij represents the standard included angle formed by joint detection point i, joint detection point j and the target joint point o in the standard posture set; α'_ij represents the current included angle corresponding to α_ij in the current posture set; and N represents the number of included angles in the posture set;
and/or,
further comprising: a timer, wherein,
the feedback unit is used for starting the timer;
the timer is used for timing, and when the timing reaches a preset timing threshold value, the acquisition subunit is triggered;
the acquisition subunit is further configured to, when receiving the trigger of the timer, perform acquisition of the current coordinates of each joint detection point in the target object in the coordinate system.
10. The apparatus according to any one of claims 6 to 9,
the posture detection unit is integrated in a first device, and the determining unit, the processing unit and the feedback unit are integrated in a second device;
or,
the determining unit and the posture detection unit are integrated in a first device, and the processing unit and the feedback unit are integrated in a second device;
or,
the determining unit, the posture detection unit and the processing unit are integrated in a first device, and the feedback unit is integrated in a second device;
or,
the determining unit, the posture detection unit, the processing unit and the feedback unit are integrated in a first device.
CN201610756271.9A 2016-08-30 2016-08-30 Posture correction method and device Pending CN106344030A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610756271.9A CN106344030A (en) 2016-08-30 2016-08-30 Posture correction method and device

Publications (1)

Publication Number Publication Date
CN106344030A true CN106344030A (en) 2017-01-25

Family

ID=57856965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610756271.9A Pending CN106344030A (en) 2016-08-30 2016-08-30 Posture correction method and device

Country Status (1)

Country Link
CN (1) CN106344030A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070186429A1 (en) * 2004-03-30 2007-08-16 Commissariat A L'energie Atomique Method and Device for Determining a Person's Motions
US20060268986A1 (en) * 2005-05-09 2006-11-30 Commissariat A L'energie Atomique Process for estimating the motion phase of an object
CN101996311A (en) * 2009-08-10 2011-03-30 深圳泰山在线科技有限公司 Yoga stance recognition method and system
CN103076045A (en) * 2011-10-25 2013-05-01 上海新世纪机器人有限公司 Head posture sensing device and method
CN103605965A (en) * 2013-11-25 2014-02-26 苏州大学 Multi-pose face recognition method and device
CN103927010A (en) * 2014-04-16 2014-07-16 北京尚德智产投资管理有限公司 Mobile terminal achieving user posture detection through monitoring program and method
CN103927250A (en) * 2014-04-16 2014-07-16 北京尚德智产投资管理有限公司 User posture detecting method achieved through terminal device
CN103955272A (en) * 2014-04-16 2014-07-30 北京尚德智产投资管理有限公司 Terminal equipment user posture detecting system
CN104157107A (en) * 2014-07-24 2014-11-19 燕山大学 Human body posture correction device based on Kinect sensor
CN105043383A (en) * 2015-07-10 2015-11-11 哈尔滨医科大学 Posture correction method and apparatus
CN105307017A (en) * 2015-11-03 2016-02-03 Tcl集团股份有限公司 Method and device for correcting posture of smart television user
CN105662785A (en) * 2016-04-08 2016-06-15 中国石油大学(北京) Arm correction device and method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108510717A (en) * 2018-05-08 2018-09-07 孙皓楠 A kind of object wearing device and its working method for sitting posture detection correction
CN108510717B (en) * 2018-05-08 2020-04-24 孙皓楠 Wearable device for sitting posture detection and correction and working method thereof
CN109753891A (en) * 2018-12-19 2019-05-14 山东师范大学 Soccer player posture calibration method and system based on human key point detection
CN109740543A (en) * 2019-01-07 2019-05-10 深圳前海默比优斯科技有限公司 A kind of the user's specific behavior analysis method and self-medicine terminal of view-based access control model
CN110569775A (en) * 2019-08-30 2019-12-13 武汉纺织大学 A method, system, storage medium and electronic device for recognizing human body posture
CN111443809A (en) * 2020-03-30 2020-07-24 南方科技大学 Neck posture detection method and device, terminal and storage medium
CN111443809B (en) * 2020-03-30 2023-06-16 南方科技大学 Neck gesture detection method and device, terminal and storage medium
CN113761989A (en) * 2020-06-05 2021-12-07 腾讯科技(深圳)有限公司 Behavior monitoring method and device, computer and readable storage medium
CN112435731A (en) * 2020-12-16 2021-03-02 成都翡铭科技有限公司 Method for judging whether real-time posture meets preset rules
CN112435731B (en) * 2020-12-16 2024-03-19 成都翡铭科技有限公司 Method for judging whether real-time gesture meets preset rules
CN112949587A (en) * 2021-03-31 2021-06-11 上海电机学院 Key point-based hand holding posture correction method and system and computer readable medium

Similar Documents

Publication Publication Date Title
CN106344030A (en) Posture correction method and device
CN106372294A (en) Method and device for correcting posture
US12026309B2 (en) Interactive motion-based eye tracking calibration
CN110495889B (en) Posture evaluation method, electronic device, computer device, and storage medium
KR101868597B1 (en) Apparatus and method for assisting in positioning user`s posture
KR101582347B1 (en) Personal Health Training Services Method and Systems
US20180178061A1 (en) Rehabilitation compliance devices
US20190053738A1 (en) Method for monitoring user gesture of wearable device
Yean et al. Smartphone orientation estimation algorithm combining Kalman filter with gradient descent
WO2018085806A1 (en) System and method for activity monitoring eyewear and head apparel
CN104239860A (en) Sitting posture detection and reminding method and device during use of intelligent terminal
CN106999101A (en) Apparatus and method for detecting human posture
Wei et al. Real-time 3D arm motion tracking using the 6-axis IMU sensor of a smartwatch
CN107273823A (en) A kind of neck attitude monitoring method merged based on sensor with image procossing
Severin Head posture monitor based on 3 IMU sensors: Consideration toward healthcare application
US9706962B1 (en) Apparatus and method for teaching and algorithms for identifying qualifying movements
CN112070031A (en) Posture detection method, device and equipment
CN110059670A (en) Human body Head And Face, limb activity angle and body appearance non-contact measurement method and equipment
KR101498498B1 (en) Method for Postural Correction Using Skeleton Tracking
CN114724207A (en) Posture monitoring method, device, equipment and computer readable storage medium
KR20160054208A (en) Clothes attachable type posture correction system
Allen et al. Evaluation of fall risk for post-stroke patients using bluetooth low-energy wireless sensor
CN111047832A (en) Mobile equipment with sitting posture adjusting function and using method thereof
KR20190018265A (en) Device, system and method for posture monitoring
Basmaji et al. Posture detection framework using the internet of wearable things

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170125