WO2012039467A1 - Exercise Support System (運動支援システム) - Google Patents
Exercise Support System
- Publication number
- WO2012039467A1 (application PCT/JP2011/071667)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- unit
- image
- evaluation
- support system
- Prior art date
Classifications
- G16H20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
- A61B5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/1036: Measuring load distribution, e.g. podologic studies
- A61B5/1113: Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114: Tracking parts of the body
- A61B5/1116: Determining posture transitions
- A61B5/1121: Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1122: Determining geometric values of movement trajectories
- A61B5/742: Details of notification to user or communication with user or patient using visual displays
- A61B5/744: Displaying an avatar, e.g. an animated cartoon character
- G06V40/23: Recognition of whole body movements, e.g. for sport training
- G09B19/0038: Teaching of repetitive work cycles or sequences of movements: sports
- G16H30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
- A61B2505/09: Rehabilitation or training
Definitions
- The present invention relates to an exercise support system that supports a user's exercise.
- In a conventional exercise support system, a video showing a model posture is displayed on a monitor, and the user moves his or her body in accordance with the video.
- The present invention has been made in view of the above, and its object is to provide an exercise support system that allows the user to visually recognize his or her own posture and how much it deviates from the posture in the displayed image.
- A first form of the exercise support system of the present invention comprises: a display device having a display surface for displaying an image to a user; a comparison image storage unit that stores a comparison image, which is an image of an exerciser performing a predetermined exercise; a comparison image display unit that displays the comparison image stored in the comparison image storage unit on the display surface; a mirror image display means that displays a mirror image of the user over the comparison image; a feature amount extraction unit that detects the positions of predetermined sample points on the user's body and obtains a feature amount representing the user's posture based on those positions; a posture evaluation unit that compares the feature amount obtained by the feature amount extraction unit with a reference amount representing the posture of the exerciser and evaluates the deviation of the user's posture from the posture of the exerciser; and a presentation unit that presents the result of the evaluation by the posture evaluation unit.
- In one form, the mirror image display means is a half mirror disposed in front of the display device.
- In another form, the mirror image display means includes: an imaging device that photographs the user and generates an image of the user; a reversal processing unit that horizontally reverses the user image generated by the imaging device to generate a horizontally reversed image; and a reversed image display unit that displays the horizontally reversed image generated by the reversal processing unit on the display surface.
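For the camera-based variant, the horizontal reversal is a one-line operation in most imaging libraries. A minimal sketch assuming an OpenCV capture loop (the device index and window name are illustrative, not from the patent):

```python
import cv2  # assumed dependency; any frame source with the same shape works

capture = cv2.VideoCapture(0)          # illustrative device index
while True:
    ok, frame = capture.read()         # frame: H x W x 3 BGR image
    if not ok:
        break
    mirrored = cv2.flip(frame, 1)      # flip around the vertical axis -> mirror image
    cv2.imshow("mirror view", mirrored)
    if cv2.waitKey(1) == 27:           # ESC to quit
        break
capture.release()
cv2.destroyAllWindows()
```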
- the predetermined exercise is an exercise that serves as a model for the exercise performed by the user.
- In another form, the system includes an imaging device that photographs the user performing the predetermined exercise and generates a recorded image, which is an image of that user, and a reference amount extraction unit. The comparison image display unit is configured to display the recorded image generated by the imaging device on the display surface as the comparison image, and the reference amount extraction unit is configured to detect the positions of the predetermined sample points on the user's body in the recorded image and to obtain a feature amount representing the user's posture, as the reference amount, based on those positions.
- In another form, the feature amount extraction unit detects the positions of a plurality of sample points on the user's body, creates a body model representing the user's body based on those positions, and obtains the feature amount based on the body model. The feature amount is, for example, the angle between a straight line connecting two sample points selected from the plurality of sample points and a predetermined reference line.
- the feature amount is an inclination of the user's body.
- When the inclination of the user's body is the inclination of the user's shoulders, the feature amount extraction unit detects a sample point on the user's right upper limb and a sample point on the left upper limb, and calculates the angle between a straight line connecting these two sample points and a horizontal line as the inclination of the user's shoulders.
- When the inclination of the user's body is the inclination of the user's trunk, the feature amount extraction unit detects the sample point of the head and the sample point of the waist, and calculates the angle between a straight line connecting these two sample points and a vertical line as the inclination of the user's trunk.
- the feature amount is a range of motion of a specific part of the user's body.
- When the specific part is an upper limb of the user, the feature amount extraction unit detects the sample point of that upper limb and the sample point of the shoulder corresponding to it, and calculates the angle between a straight line connecting the two sample points and a vertical line as the range of motion of the upper limb.
- In another form, the number of sample points is at least three, and the feature amount is the area of a region defined by the plurality of sample points.
- In another form, the comparison image display unit adjusts the position and size of the comparison image on the display surface so that the comparison image overlaps the mirror image of the user.
- the posture evaluation unit is configured to obtain a numerical value indicating a difference between the feature quantity and the reference quantity.
- the presenting unit is configured to present the numerical value obtained by the posture evaluating unit.
- In another form, the system includes: a position detection unit that detects the position of a specific part of the user's body; an index setting unit that determines the position of an index on the display surface based on the position detected by the position detection unit; a determination unit that determines whether the index is located at a predetermined position on the display surface; an event image display unit that displays a predetermined event image at the predetermined position when the determination unit determines that the index is located there; an evaluation data creation unit that creates evaluation data indicating the range of motion of the specific part based on the positions detected by the position detection unit; and a range of motion evaluation unit that evaluates the range of motion of the specific part by comparing the evaluation data created by the evaluation data creation unit with reference data. The presentation unit is configured to present the result of the evaluation by the range of motion evaluation unit.
- In another form, the range of motion evaluation unit adopts, as the reference data, the evaluation data used in the previous evaluation of the range of motion of the specific part.
- Alternatively, the reference data is data indicating the standard range of motion of the specific part for a healthy person.
- The evaluation data may include data indicating the area of the region through which the specific part has passed in a plane parallel to the display surface, data indicating the range of motion of the specific part in a predetermined direction, or data indicating the time required for the user to perform a predetermined action with the specific part.
- the position detector detects a position of the specific part from an output of a three-dimensional sensor.
- the evaluation data includes data indicating a volume of a region through which the specific part has passed.
- the evaluation data includes data indicating a locus of the specific part.
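As an illustrative sketch of one kind of evaluation data, the area of the region through which the specific part has passed can be approximated by rasterizing the tracked positions onto an occupancy grid (the grid resolution and the stand-in trajectory are assumptions, not values from the patent):

```python
import numpy as np

def swept_area(track_xy: np.ndarray, cell: float = 0.02) -> float:
    """Approximate the area covered by a tracked body part.

    track_xy: (N, 2) positions of the specific part, in metres, in a
    plane parallel to the display surface. cell: grid resolution.
    """
    cells = np.unique(np.floor(track_xy / cell).astype(int), axis=0)
    return len(cells) * cell * cell  # occupied cells x cell area

# Stand-in track: a hand sweeping a quarter arc of radius 0.6 m.
t = np.linspace(0.0, np.pi / 2, 200)
track = 0.6 * np.column_stack([np.cos(t), np.sin(t)])

previous_area = 0.045  # evaluation data from the previous session (stand-in)
print(f"swept area: {swept_area(track):.4f} m^2")
print("improved" if swept_area(track) > previous_area else "not improved")
```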
- When the mirror image display means is a half mirror disposed in front of the display device, the index setting unit determines the position of the index so that it corresponds to the position on the display surface that overlaps the specific part reflected in the half mirror.
- When the mirror image display means consists of an imaging device that photographs the user and generates an image of the user, a reversal processing unit that horizontally reverses that image, and a reversed image display unit that displays the horizontally reversed image on the display surface, the index setting unit determines the position of the index so that it corresponds to the position of the specific part in the horizontally reversed image.
- In another form, the system includes: a measuring device having a working surface that receives the load from the user, the device measuring the distribution of that load over the working surface; a calculation unit that calculates a balance value representing the ratio of the load at a predetermined location on the working surface based on the measured load distribution; a balance value storage unit that stores the balance values calculated by the calculation unit; a balance value display unit that displays the balance value calculated by the calculation unit on the display surface; a setting data storage unit that stores setting data indicating a temporal change of a target value of the balance value; a target value display unit that displays the target value on the display surface based on the setting data stored in the setting data storage unit; and a center-of-gravity movement evaluation unit that evaluates the user's center-of-gravity movement based on the temporal change of the balance value stored in the balance value storage unit and the temporal change of the target value indicated by the setting data. The presentation unit is configured to present the result of the evaluation by the center-of-gravity movement evaluation unit.
- In another form, the center-of-gravity movement evaluation unit evaluates the center-of-gravity movement using the difference between the balance value and the target value at a predetermined time point.
- In another form, a setting update unit is provided that changes the temporal change of the target value indicated by the setting data stored in the setting data storage unit according to the result of the evaluation by the center-of-gravity movement evaluation unit.
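A minimal sketch of the balance-value idea (assuming the measuring device delivers its load distribution as a 2-D pressure grid; treating the left half of the working surface as the "predetermined location" is an illustrative choice):

```python
import numpy as np

def balance_value(pressure: np.ndarray) -> float:
    """Ratio of the load on the left half of the working surface.

    pressure: (rows, cols) grid of load readings from the measuring
    device; columns are assumed to run from the user's left to right.
    Returns a value in [0, 1]; 0.5 means the load is evenly split.
    """
    total = pressure.sum()
    if total <= 0.0:
        return 0.5  # no load detected; treat as balanced
    left = pressure[:, : pressure.shape[1] // 2].sum()
    return float(left / total)

# Evaluating center-of-gravity movement: compare the measured balance
# value with a target value trajectory over time (stand-in data below).
t = np.linspace(0.0, 10.0, 101)                        # seconds
target = 0.5 + 0.3 * np.sin(2 * np.pi * t / 10)        # illustrative target sway
measured = target + np.random.normal(0, 0.05, t.size)  # stand-in measurements
print(f"mean deviation from target balance: {np.abs(measured - target).mean():.3f}")
```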
- A twenty-ninth form of the exercise support system of the present invention is any one of the twenty-sixth to twenty-eighth forms, wherein the mirror image display means is a half mirror disposed in front of the display surface.
- FIG. 1 is a schematic diagram illustrating the system configuration of an exercise support system according to Embodiment 1.
- FIG. 2 is an explanatory diagram of the operation of the exercise support system of Embodiment 1.
- FIGS. 3 and 4 are explanatory diagrams showing the generation of a body model in the exercise support system of Embodiment 1.
- FIGS. 5 and 6 are explanatory diagrams showing a posture deviation in the exercise support system of Embodiment 1.
- FIG. 7 is an explanatory diagram of the method of calculating the inclination of the shoulders in the exercise support system of Embodiment 1.
- FIG. 8 is an explanatory diagram of the method of calculating the inclination of the trunk in the exercise support system of Embodiment 1.
- FIG. 9 is an explanatory diagram of the method of calculating the range of motion of an upper limb in the exercise support system of Embodiment 1.
- FIG. 10 is a schematic diagram showing the system configuration of an exercise support system according to another embodiment.
- The exercise support system 1 of the present embodiment is used for rehabilitation aimed at restoring the function of the limbs by having a patient whose limbs remain impaired due to illness or injury perform predetermined exercises.
- The following description is not intended to limit the use of the exercise support system; for example, a healthy person may use it for exercise such as yoga or dance.
- The embodiment shows an example in which the user stands while using the exercise support system, but the present invention is not limited to this; the user may also use the system while sitting on a chair or the like.
- The exercise support system 1 includes a display device 3 that displays an image on a display surface 30 arranged in front of and facing the user (patient) 2, a distance image sensor 4 that generates a distance image, and a control device 5 that controls the operation of the display device 3 and the other components. Both the display device 3 and the distance image sensor 4 are connected to the control device 5.
- the exercise support system 1 further includes a half mirror 6 disposed in front of the display surface 30 of the display device 3 (on the user 2 side).
- The half mirror 6 is arranged vertically between the display device 3 and the user 2 so that its front surface (mirror surface) faces the user 2, and it transmits the image displayed on the display device 3 behind it to the user 2 side.
- the display device 3 has a display surface 30 for displaying an image to the user 2.
- the display device 3 is a plasma display here, and is attached to the back side of the half mirror 6.
- Although the structure supporting the half mirror 6 and the mounting structure of the display device 3 are not shown, the half mirror 6 and the display device 3 are fixed in position with sufficient strength.
- the display device 3 is not limited to a plasma display, and may be another display device such as a liquid crystal display.
- It is also conceivable to use a projection-type display device (not shown) consisting of a diffusion sheet (not shown) attached to the back surface of the half mirror 6 and a projector that projects an image onto the diffusion sheet from behind the half mirror 6 (the side opposite the user 2).
- The half mirror 6 has a vertically long rectangular front surface and is sized so that it can serve as a full-length mirror for the user 2.
- the transmittance of the half mirror 6 is designed so that the half mirror 6 can be used as a mirror and the user 2 can visually recognize the image displayed on the display device 3 through the half mirror 6.
- the half mirror 6 is formed by applying a mirror surface coating with a metal film or the like on at least one surface of a transparent base material made of glass or synthetic resin.
- the display device 3 is disposed so that the display surface 30 is in contact with the back surface of the half mirror 6.
- The height of the display device 3 is determined so that its lower edge lies a predetermined distance above the lower end of the half mirror 6 and its upper edge lies a predetermined distance below the upper end of the half mirror 6; that is, the display device 3 is disposed slightly above the center of the half mirror 6.
- The space between the half mirror 6 and the display surface 30 may be filled with a transparent material that matches the refractive index and prevents reflection, so that the image displayed on the display device 3 appears on the front surface of the half mirror 6 with high luminance.
- The front surface of the half mirror 6 thus serves both as a mirror showing a mirror image of the user 2 and as a screen showing the image displayed on the display surface 30 of the display device 3. That is, when the user 2 stands in front of the half mirror 6, the mirror image of the user 2 is reflected on the front surface of the half mirror 6, and the image displayed on the display device 3 passes through the half mirror 6 and appears on its front surface. As will be described in detail later, the video displayed on the display device 3 is generated by the control device 5.
- the distance image sensor 4 generates a distance image in which the pixel value is a distance value based on the principle of the time-of-flight method using intensity-modulated light.
- the distance image sensor 4 may be configured to generate a distance image, and is not limited to the distance image sensor using the time-of-flight method.
- the distance image sensor 4 detects the distance to the detection target existing in the sensing area, and detects the position of the detection target in the three-dimensional space as a distance image.
- the distance image sensor 4 is disposed above the display device 3 so as not to overlap the display device 3 and generates a distance image of the user 2 in front of the half mirror 6.
- The distance image sensor 4 is positioned at approximately the center in the left-right direction above the display device 3, with its vertical orientation (tilt angle) set so that it looks down on the user 2 from diagonally above and in front, and its horizontal orientation (pan angle) adjusted so that the whole body of the user 2 is within its field of view and the left-right center line of the body of the upright user 2 coincides with the left-right center line of the distance image.
- The distance image sensor 4 is not limited to the above arrangement; it may be installed, for example, on a camera stand at the eye level of the user 2 on the front side (user 2 side) of the half mirror 6.
- the adjustment of the position and orientation of the distance image sensor 4 as described above is performed as an initial setting after the position and orientation of the user 2 are determined.
- the distance image sensor 4 generates a moving image of the distance image that reflects the whole body of the user 2.
- The exercise support system 1 displays on the display device 3, while the user 2 exercises, an image of a comparison target performing the same exercise as the exercise to be performed by the user 2, as a comparison image. For this purpose, the exercise support system 1 of the present embodiment includes a comparison image storage unit (storage unit) 51 that stores a comparison image (comparison video), which is an image of an exerciser performing a predetermined exercise, and a comparison image display unit (display control unit) 52 that displays the comparison image stored in the comparison image storage unit 51 on the display surface 30; both are part of the control device 5.
- In the present embodiment, an image showing a model of the exercise to be performed by the user 2 is used as the comparison image; that is, the predetermined exercise serves as an example of the exercise performed by the user 2. An instructor serving as the comparison target actually performs the exercise to be performed by the user 2 and is photographed from the front with an imaging device (not shown) so that the instructor's whole body is captured. The resulting moving image is used as the comparison video and is stored in advance in the storage unit 51.
- the comparison image may be a still image.
- the comparison image shows the posture of the exerciser that the user 2 should use as a model. That is, the comparison image may be a still image of an exerciser who performs a predetermined exercise, or a moving image of an exerciser who performs a predetermined exercise.
- The display control unit 52 includes a size adjusting unit (not shown) and a position adjusting unit (not shown) that adjust the size and position of the comparison image on the display surface 30 so that, as seen from the user 2, the mirror image reflected in the half mirror 6 overlaps the comparison target in the comparison image. That is, the display control unit (comparison image display unit) 52 is configured to adjust the position and size of the comparison image on the display surface 30 so that the comparison image overlaps the mirror image of the user 2.
- The size adjusting unit adjusts the display size so that the comparison target in the comparison video is displayed on the display surface 30 of the display device 3 at one half of life size. Because a mirror image appears on the mirror surface at half of the subject's actual size, this adjustment makes the size of the comparison target match the mirror image of the user 2.
- The position adjusting unit has an alignment function that can freely adjust the display position of the comparison video on the display surface 30, and it adjusts the display position so that, from the viewpoint of the user 2, the image of the comparison target (instructor) is displayed at a position overlapping the user's own mirror image. The position of the comparison video on the display surface 30 is adjusted according to the positional relationship between the half mirror 6 and the user 2.
- the positional relationship between the half mirror 6 and the user 2 may be determined in advance so that the user 2 stands at the front center of the half mirror 6.
- a pressure sensor (not shown) that detects the position of the center of gravity of the user 2 may be added, and the positional relationship between the half mirror 6 and the user 2 may be obtained from the detection result of the pressure sensor.
- The control device 5 includes a height registration unit (not shown) in which a standard value (for example, 170 cm) is registered in advance as the height of the user 2; the registered height may also be input directly by the user 2. By displaying the comparison video at a position based on the height of the user 2, the display control unit 52 can display the comparison target at a position that, from the viewpoint of the user 2, overlaps the user's own mirror image.
- The position on the half mirror 6 of a specific part (for example, the head) of the mirror image can be obtained from the position and orientation of the distance image sensor 4 and from the standing position and head-top position of the user 2 detected from the distance image. Therefore, the display control unit 52 can display the comparison target at a position that overlaps the mirror image of the user 2 as seen from the viewpoint of the user 2.
- the adjustment of the display size and display position of the comparison video may be performed manually at the initial setting or may be automated.
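The geometry behind this alignment can be illustrated with a short sketch (standard plane-mirror optics, not code from the patent): the point where a body part appears on the mirror surface is where the line from the user's eye to the part's virtual image, which lies an equal distance behind the mirror plane, crosses that plane.

```python
import numpy as np

def point_on_mirror(eye: np.ndarray, part: np.ndarray) -> np.ndarray:
    """Where a body part's mirror image appears on the mirror plane.

    Coordinates are (x, y, z) in metres with the mirror in the plane
    z = 0 and the user standing at z > 0. The virtual image of `part`
    lies at z = -part[2]; the visible point is where the eye-to-image
    line crosses z = 0.
    """
    image = part * np.array([1.0, 1.0, -1.0])   # reflect across the mirror
    t = eye[2] / (eye[2] - image[2])            # parameter of the z = 0 crossing
    return eye + t * (image - eye)

eye = np.array([0.0, 1.6, 2.0])    # eye 1.6 m high, 2 m from the mirror
head = np.array([0.0, 1.7, 2.0])   # top of the head
print(point_on_mirror(eye, head))  # -> [0.0, 1.65, 0.0]: halfway up, half scale
```

Displaying the comparison target at such projected positions, at half of life size, makes it coincide with the user's mirror image.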
- As shown in FIG. 2, a mirror image 20 of the user 2 appears on the front surface of the half mirror 6, and the video (comparison video) of the comparison target (instructor) 31 showing the model posture is seen through the half mirror 6. In this way the half mirror 6 disposed in front of the display device 3 functions as the mirror image display means that displays the mirror image of the user 2 over the comparison image. The user 2 can therefore exercise while comparing his or her own mirror image 20 reflected in the half mirror 6 with the comparison target 31 in the comparison video displayed on the display surface 30.
- In FIG. 2, the mirror image 20 of the user 2 is indicated by a broken line, and the comparison target 31 in the comparison video displayed on the display surface 30 is indicated by a solid line.
- Since the comparison video is displayed so that the mirror image 20 of the user 2 and the comparison target 31 overlap, the user 2 can easily grasp from the comparison video how to move the body. Moreover, the image of the comparison target 31 falls within the field of view of the user 2, so the user 2 can compare his or her own mirror image 20 with the comparison target 31 with almost no movement of the line of sight. Furthermore, because the right half of the comparison target 31 is displayed over the mirror image of the right half of the user 2 and the left half of the comparison target 31 over the mirror image of the left half, the user 2 can easily distinguish left from right in the movement of the comparison target 31 and easily grasp how to move the body.
- It is desirable that the brightness of the display device 3 and the brightness of the room be adjusted appropriately so that, as seen from the user 2, there is no large difference in visibility between the mirror image 20 reflected in the half mirror 6 and the comparison video displayed on the display device 3.
- A moving image of the comparison target (instructor) captured from a direction other than the front (for example, from the side) may additionally be stored in the storage unit 51 as a reference video, and the display control unit 52 may be configured to display the reference video on the display device 3 together with the comparison video. In this case it is desirable that the reference video be displayed beside the comparison video on the display surface 30 so as not to overlap it. With the reference video displayed on the display device 3, the user 2 can refer to it to understand, for example, the movement of the body in the front-rear direction, which makes it easier to move the body in accordance with the comparison target 31 in the comparison video.
- The display control unit 52 is not limited to displaying the comparison target 31 at a position overlapping the mirror image 20 of the user 2; the comparison target 31 may instead be displayed beside, above, or below the mirror image 20 of the user 2 on the display surface 30. Even when the comparison target 31 is displayed at a position overlapping the mirror image 20, the mirror image 20 and the comparison target 31 need not overlap completely; it suffices that the parts to be compared overlap.
- the distance image of the user 2 generated by the distance image sensor 4 is output to the control device 5 and used for processing for recognizing the posture of the user 2.
- The control device 5 comprises a computer and includes an acquisition unit 53 that acquires the distance image from the distance image sensor 4, a first extraction unit (feature amount extraction unit) 54 that extracts a first feature amount representing the posture of the user 2 from the acquired distance image, a second extraction unit (reference amount extraction unit) 55 that extracts a second feature amount representing the posture of the comparison target 31, an evaluation unit (posture evaluation unit) 56 that evaluates the posture deviation between the user 2 and the comparison target 31, and a presentation unit 57 that presents the evaluation result of the evaluation unit 56.
- The first extraction unit 54 detects the positions of specific points of the user 2 in the distance image by image recognition, and extracts the first feature amount representing the posture of the user 2 based on those positions. That is, the first extraction unit (feature amount extraction unit) 54 is configured to detect the positions of predetermined sample points (specific points) 22 on the body of the user 2 and to obtain a feature amount (first feature amount) representing the posture of the user 2 based on those positions.
- Described below is the case where a plurality of points set at positions such as the center line of the trunk, the top of the head, the shoulders, the elbows, the hands, the waist, the knees, and the ankles of the user 2 are used as the specific points. In general, the specific points need only be set at identifiable positions in the distance image, such as the fingertips of the user 2. The first extraction unit 54 extracts the first feature amount representing the posture of the user 2 based on the positions of the specific points detected in this way.
- Similarly, the second extraction unit 55 detects the positions of specific points of the comparison target 31 and extracts a second feature amount representing the posture of the comparison target 31 based on those positions. That is, the second extraction unit (reference amount extraction unit) 55 detects the positions of predetermined sample points (specific points) 33 on the body of the comparison target (exerciser) 31 and obtains a reference amount (second feature amount) representing the posture of the comparison target 31 based on those positions. The positions of the specific points of the comparison target 31 are detected from the distance image obtained by the distance image sensor 4 when the comparison video was captured.
- The evaluation unit 56 compares the first feature amount with the second feature amount to evaluate the posture deviation between the user 2 and the comparison target 31. That is, the evaluation unit (posture evaluation unit) 56 is configured to compare the feature amount (first feature amount) obtained by the first extraction unit (feature amount extraction unit) 54 with the reference amount (second feature amount) representing the posture of the exerciser (comparison target 31), and to evaluate the deviation of the posture of the user 2 from the posture of the exerciser.
- The presentation unit 57 presents to the user 2 the posture deviation between the user 2 and the comparison target 31 evaluated by the evaluation unit 56; that is, the presentation unit 57 is configured to present the result of the evaluation by the evaluation unit (posture evaluation unit) 56. Specifically, the presentation unit 57 presents the evaluation result to the user 2 by sound or light. Alternatively, the evaluation result may be displayed on the display device 3 according to an instruction from the display control unit 52, so that the display device 3 doubles as the presentation unit 57.
- The first extraction unit 54 detects the positions of a plurality of specific points 22 over the entire body of the user 2, as shown in FIGS. 3 and 4, and generates a body model (hereinafter, "first body model") 21 representing the posture of the user 2 in space, for example by connecting the specific points 22 with straight lines. That is, the first extraction unit 54 creates the first body model 21 shown in FIG. 4 based on the positions of the plurality of sample points 22, and obtains the feature amount (first feature amount) based on the body model 21.
- The plurality of sample points 22 on the body of the user 2 are, for example, the sample point 22a on the head, the sample points 22b and 22c on the left and right shoulders, the sample points 22d and 22e on the left and right elbows, the sample points 22f and 22g on the left and right hands, the sample point 22h on the waist, the sample points 22i and 22j on the left and right knees, and the sample points 22k and 22l on the left and right feet of the user 2. The first extraction unit 54 generates the first body model 21 based on these twelve sample points 22a-22l (see FIG. 4).
- Since the distance image of the user 2 generated by the distance image sensor 4 is a moving image that changes as the user 2 moves, the first body model 21 generated from each frame of the distance image represents the movement of the user 2 in real time.
- As shown in FIG. 1, the distance image sensor 4 is installed above the user 2, so the image (distance image) of the user 2 obtained from the distance image sensor 4 is an image of the user 2 photographed from diagonally above; that is, the distance image is a distorted image of the user 2, and the depression angle of the camera of the distance image sensor 4 may introduce an error into the feature amount. Therefore, as preprocessing for detecting the sample points (specific points) 22, the first extraction unit 54 performs a correction process that converts the distance image obtained from the distance image sensor 4 into a distance image as if the user 2 had been photographed from the front. Such correction can be performed by transforming the coordinates of the image using a matrix obtained in advance.
- After performing the correction process, the first extraction unit 54 detects the positions of the sample points 22 from the distance image. When the error due to the depression angle is small, the correction process may be omitted.
- The distance image sensor 4 may also be installed below the eye level of the user 2, for example at the center of the lower end of the front surface of the half mirror 6. In this case too, the first extraction unit 54 performs a correction process that converts the distance image obtained from the distance image sensor 4 into a distance image as if the user 2 had been photographed from the front. The distance image sensor 4 may also be installed to the right or left of the user 2; in that case a similar correction process yields a distance image equivalent to photographing the user 2 from the front.
- The positions of the specific points 22 detected from the distance image are converted from coordinates in the imaging coordinate system defined for the distance image obtained by the distance image sensor 4 into coordinates in a display coordinate system defined in a virtual space. The virtual space corresponds to a rectangular parallelepiped space containing the standing position of the user 2 in front of the display device 3, and is a three-dimensional orthogonal coordinate space whose axes are the front-rear, left-right, and up-down directions of the user 2. The first extraction unit 54 converts the positions of the specific points 22 from the polar coordinate system referenced to the distance image sensor 4 into the three-dimensional orthogonal coordinate system defined in the virtual space using a predetermined conversion formula, and then generates the first body model 21.
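A minimal sketch of such a conversion, assuming the sensor reports a direction (azimuth, elevation) and a range per pixel and that its pose in the virtual space is known (all pose values below are illustrative, not from the patent):

```python
import numpy as np

def sensor_to_virtual(azimuth, elevation, distance, sensor_pos, pitch_deg):
    """Convert a sensor-relative polar measurement into virtual-space
    coordinates (x: left-right, y: up-down, z: front-rear).

    azimuth, elevation: ray direction in radians in the sensor frame.
    distance: range value of the pixel in metres.
    sensor_pos: (3,) position of the sensor in the virtual space.
    pitch_deg: downward tilt of the sensor (depression angle).
    """
    # Unit ray in the sensor frame (z along the optical axis).
    ray = np.array([np.cos(elevation) * np.sin(azimuth),
                    np.sin(elevation),
                    np.cos(elevation) * np.cos(azimuth)])
    p = distance * ray
    # Undo the depression angle: rotate about the x axis.
    t = np.radians(pitch_deg)
    rot_x = np.array([[1.0, 0.0, 0.0],
                      [0.0, np.cos(t), -np.sin(t)],
                      [0.0, np.sin(t), np.cos(t)]])
    return sensor_pos + rot_x @ p

# Sensor mounted 2.2 m high above the display, tilted 25 degrees downward.
point = sensor_to_virtual(0.05, -0.3, 2.5,
                          sensor_pos=np.array([0.0, 2.2, 0.0]), pitch_deg=25.0)
print(point)
```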
- Similarly, the second extraction unit 55 generates a body model (hereinafter, "second body model") 32 (see FIG. 6) representing the posture of the comparison target 31 in the comparison video. That is, the second extraction unit (reference amount extraction unit) 55 detects the positions of a plurality of sample points 33 on the body of the comparison target (exerciser) 31, creates the body model 32 representing the body of the comparison target 31 based on those positions, and obtains the reference amount (second feature amount) based on the body model 32.
- The plurality of sample points 33 on the body of the comparison target (exerciser) 31 are defined in the same way as the sample points 22, for example the sample point 33a on the head of the comparison target 31, sample points on the shoulders, elbows, hands, waist, knees, and feet, and so on. The second extraction unit 55 generates the second body model 32 based on these twelve sample points 33a-33l (see FIG. 6).
- The second body model 32 representing the posture of the comparison target 31 is generated, in the same way as the first body model 21, from the distance image obtained by the distance image sensor 4 when the comparison video was captured. Like the first extraction unit 54, the second extraction unit 55 performs, as preprocessing for detecting the sample points (specific points) 33, a correction process that converts the distance image obtained from the distance image sensor 4 into a distance image as if the comparison target 31 had been photographed from the front. The second extraction unit 55 then generates the second body model 32 by connecting the plurality of specific points 33 in that distance image with straight lines. Like the first body model 21, the second body model 32 is generated in the three-dimensional orthogonal coordinate system (display coordinate system) defined in the virtual space. The second body model 32 generated in this way may be stored in the storage unit 51 in advance together with the comparison video; if the reference amount is stored in the storage unit 51 in advance, the second extraction unit 55 can be omitted.
- The generated first body model 21 may be displayed by the display control unit 52 at a position overlapping the mirror image 20 of the user 2 on the display surface 30, or beside, above, or below the mirror image 20. Likewise, the second body model 32 may be displayed by the display control unit 52 at a position overlapping the comparison target 31 in the comparison video, or beside, above, or below the comparison target 31 on the display surface 30.
- The first extraction unit 54 obtains, from the first body model 21, the angle of a straight line connecting two specific points 22 with respect to a predetermined reference line, and uses this angle as the first feature amount. That is, the feature amount (first feature amount) is the angle between a straight line connecting two sample points 22 selected from the plurality of sample points 22 of the first body model 21 and a predetermined reference line. For example, focusing on the specific point 22c corresponding to the right shoulder in the first body model 21 shown in FIG. 5, the angle θ of the straight line connecting the right shoulder and the right elbow (the straight line connecting the sample point 22c and the sample point 22e) with respect to a reference line, that is, the angle θ of the right shoulder joint, is obtained. For the right elbow, the straight line connecting the right shoulder and the right elbow is used as the reference line, and the angle ψ of the straight line connecting the right elbow and the right hand (the straight line connecting the sample point 22e and the sample point 22g) with respect to this reference line, that is, the angle ψ of the right elbow joint, is obtained.
- Similarly, the second extraction unit 55 obtains, from the second body model 32, the angle of a straight line connecting two specific points 33 with respect to a predetermined reference line, and uses this angle as the second feature amount. That is, the reference amount (second feature amount) is the angle between a straight line connecting two sample points 33 selected from the plurality of sample points 33 of the second body model 32 and a predetermined reference line. For example, the angle θ of the right shoulder joint and the angle ψ of the right elbow joint are obtained from the second body model 32 as shown in FIG. 6.
- In this way, the first extraction unit 54 extracts the angle of a given joint obtained from the first body model 21 as the first feature amount, and the second extraction unit 55 extracts the angle of the same joint obtained from the second body model 32 as the second feature amount.
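A minimal sketch of a joint-angle feature of this kind (the sample-point names follow the description above; the vector arithmetic is standard, not taken from the patent):

```python
import numpy as np

def segment_angle(a, b, c, d):
    """Angle in degrees between the line a->b and the reference line c->d."""
    u, v = np.asarray(b) - np.asarray(a), np.asarray(d) - np.asarray(c)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# body: sample-point name -> 3-D position from the body model (illustrative pose)
body = {"right_shoulder": (0.20, 1.40, 0.0),   # 22c
        "right_elbow":    (0.45, 1.30, 0.0),   # 22e
        "right_hand":     (0.70, 1.45, 0.0)}   # 22g

# Elbow joint angle: forearm (elbow->hand) against upper arm (shoulder->elbow).
psi = segment_angle(body["right_elbow"], body["right_hand"],
                    body["right_shoulder"], body["right_elbow"])
print(f"right elbow joint angle: {psi:.1f} deg")
```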
- the feature amount may be the inclination of the body of the user 2.
- the inclination of the body of the user 2 is, for example, the inclination of the shoulder of the user 2 (see FIG. 7).
- The first extraction unit (feature amount extraction unit) 54 is configured to detect the positions of a sample point 22 on the right upper limb and a sample point 22 on the left upper limb of the user 2. Here, the sample point 22 of the right upper limb is the sample point 22e of the right elbow, and the sample point 22 of the left upper limb is the sample point 22d of the left elbow. Alternatively, the sample point 22 of the right upper limb may be the sample point 22c of the right shoulder, and the sample point 22 of the left upper limb may be the sample point 22b of the left shoulder.
- The first extraction unit 54 is configured to calculate the angle between the straight line L100 connecting the sample point 22 of the right upper limb and the sample point 22 of the left upper limb and a horizontal line L200 as the shoulder inclination of the user 2.
- In the example shown in FIG. 7, the inclination of the shoulders of the user 2 is −20°; here the shoulder inclination is expressed so as to be negative when the slope of the straight line L100 is positive, and positive when the slope of the straight line L100 is negative.
- the inclination of the body of the user 2 may be, for example, the inclination of the trunk of the user 2 (see FIG. 8).
- In this case, the first extraction unit (feature amount extraction unit) 54 detects the positions of the sample point 22a on the head and the sample point 22h on the waist of the user 2, and calculates the angle between the straight line L110 connecting the sample point 22a and the sample point 22h and a vertical line L210 as the inclination of the trunk of the user 2.
- In the example shown in FIG. 8, the inclination of the trunk of the user 2 is −30°; here the trunk inclination is expressed so as to be positive when the slope of the straight line L110 is positive, and negative when the slope of the straight line L110 is negative.
- the reference amount extraction unit 55 can obtain the body inclination (shoulder or trunk inclination) of the comparison target 31 as the reference amount (second feature amount).
- the feature amount may be a movable range of a specific part of the body of the user 2.
- the specific part is the upper limb of the user 2 (see FIG. 9).
- the specific part is the upper right limb.
- The first extraction unit (feature amount extraction unit) 54 detects the positions of the sample point 22 of the upper limb of the user 2 (the sample point 22g of the right hand) and the sample point 22 of the shoulder corresponding to that upper limb (the sample point 22c of the right shoulder). The sample point 22 of the upper limb of the user 2 may instead be the sample point 22e of the right elbow.
- The first extraction unit 54 is configured to calculate the angle between the straight line L120 connecting the upper limb sample point 22 (22g) and the shoulder sample point 22 (22c) and the vertical line L210 as the range of motion of the upper limb. In the example illustrated in FIG. 9, the first extraction unit 54 obtains the range of motion of the right upper limb of the user 2, and its value is 70°.
- the specific part may be the left upper limb.
- In that case, the first extraction unit (feature amount extraction unit) 54 need only detect the positions of the sample point 22 of the left upper limb (the sample point 22f of the left hand or the sample point 22d of the left elbow) and the sample point 22 of the shoulder corresponding to that upper limb (the sample point 22b of the left shoulder).
- the specific part may be the lower limb (right leg or left leg).
- In that case, the first extraction unit 54 calculates, as the range of motion of the lower limb, the angle between the vertical line L210 and a straight line connecting a sample point 22 of the lower limb (for example, the sample points 22i, 22k, 22j, 22l of the left knee, left foot, right knee, and right foot) and the sample point 22h of the waist.
- the reference amount extraction unit 55 can obtain the range of motion of a specific part (upper limb or lower limb) of the body of the comparison target 31 as the reference amount (second feature amount).
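These inclination and range-of-motion features are all angles of a line between two sample points against a horizontal or vertical reference. A minimal sketch, with the sign conventions and example values taken from the description above and with illustrative coordinates:

```python
import numpy as np

def signed_tilt_deg(p_from, p_to, reference="horizontal"):
    """Angle in degrees of the line p_from->p_to against a horizontal or
    vertical reference, in the plane parallel to the display surface."""
    dx, dy = p_to[0] - p_from[0], p_to[1] - p_from[1]
    if reference == "horizontal":
        return np.degrees(np.arctan2(dy, dx))   # 0 deg = level shoulders
    return np.degrees(np.arctan2(dx, dy))       # 0 deg = upright line

# Shoulder inclination: negative when the slope of L100 is positive.
shoulder_tilt = -signed_tilt_deg((-0.20, 1.42), (0.20, 1.30))  # left elbow 22d -> right elbow 22e

# Trunk inclination: positive when the slope of L110 is positive.
trunk_tilt = signed_tilt_deg((0.0, 0.95), (0.10, 1.65), "vertical")  # waist 22h -> head 22a

# Upper-limb range of motion: angle of L120 against the vertical line L210.
arm_rom = abs(signed_tilt_deg((0.20, 1.40), (0.85, 1.62), "vertical"))  # 22c -> 22g

print(f"shoulder inclination: {shoulder_tilt:+.1f} deg")
print(f"trunk inclination:    {trunk_tilt:+.1f} deg")
print(f"right arm range of motion: {arm_rom:.1f} deg")
```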
- The first body model 21 is not limited to a body model generated from a distance image capturing the user 2 from the front; it may also be a body model generated from a distance image capturing the user 2 from the side. For example, if a pressure sensor installed at the feet of the user 2 detects the positions of both feet and the center of gravity of the user 2, it can be determined from the result whether the body of the user 2 faces the distance image sensor 4 frontally or sideways. Based on this determination, the first extraction unit 54 generates a front-view first body model 21 when the user 2 faces the front, and generates a lateral first body model 21, by rotating the front-view body model 90 degrees around the vertical axis, when the user 2 faces sideways. With a lateral body model, movement of the body in the front-rear direction (for example, the forward or backward inclination of the straight line connecting the shoulder and the waist) can also be extracted as a first feature amount. The second body model 32 may likewise be generated as a lateral body model.
- The evaluation unit 56 evaluates the posture deviation between the user 2 and the comparison target 31 by comparing the joint angle extracted from the first body model 21 as the first feature amount with the joint angle extracted from the second body model 32 as the second feature amount. By comparing joint angles, the evaluation unit 56 can evaluate the posture deviation objectively while ignoring differences in arm length and the like. The evaluation unit 56 may compare the angles of a plurality of joints between the two body models, or may compare the angle of only a specific joint of interest.
- Specifically, the evaluation unit 56 calculates the difference between the first feature amount and the second feature amount. The evaluation unit 56 may quantify the degree of posture deviation between the user 2 and the comparison target 31 according to the calculated difference, or may determine whether a posture deviation from the comparison target 31 exists by checking whether the difference exceeds a predetermined threshold.
- Because the evaluation unit 56 performs the evaluation using feature amounts extracted from the first body model 21 and the second body model 32, which are three-dimensional models, not only deviation within the display plane but also deviation in the depth direction (the direction orthogonal to the display surface 30) can be evaluated.
- The content presented by the presentation unit 57 may be the determination result of whether the posture of the user 2 deviates from the comparison target 31, or content representing the degree of the deviation. Furthermore, when the evaluation unit 56 determines in detail, by comparing the first feature amount and the second feature amount, which part of the body deviates in which direction, the presentation unit 57 may present advice for reducing the deviation. For example, if the right elbow joint of the user 2 is bent during an exercise of spreading the right arm out to the side, the presentation unit 57 presents advice such as "Extend your arm."
- the display control unit 52 may perform highlighting by changing the color of a part of the comparison target 31 in the comparison video where the deviation amount of the posture of the user 2 is large. For example, when the posture of the right arm deviates greatly, the right arm portion of the comparison target 31 displayed in the comparison video is highlighted, and the deviation of the right arm portion is presented to the user 2. Accordingly, the user 2 can exercise while paying attention to the portion that deviates from the comparison target 31, and can more easily exercise in accordance with the comparison target 31 in the comparison video.
- the presentation unit 57 feeds back to the user 2 information such as which part of the body of the user 2 deviates from the comparison target 31 and in which direction, so that the user 2 can learn the correct exercise.
- when the evaluation unit 56 quantifies the degree of the posture deviation between the user 2 and the comparison target 31 according to the difference value between the first feature amount and the second feature amount, the presentation unit 57 may present this numerical value representing the degree of deviation. That is, the evaluation unit (posture evaluation unit) 56 is configured to obtain a numerical value indicating the difference between the feature amount (first feature amount) and the reference amount (second feature amount), and the presentation unit 57 may present the numerical value obtained by the posture evaluation unit 56. Specifically, the presentation unit 57 can present the degree of deviation or the degree of coincidence by presenting the difference value between the first feature amount and the second feature amount as it is, or by presenting a score, obtained according to the difference value, indicating the degree to which the postures of the user 2 and the comparison target 31 coincide.
- the presentation unit 57 may divide a numerical value representing the degree of coincidence between the postures of the user 2 and the comparison target 31 into a plurality of stages, rank them, and present them.
- the first extraction unit 54 obtains the area of a region surrounded by a straight line connecting a plurality of specific points 22 from the first body model 21 and uses this area as a first feature amount.
- the number of sample points in the first body model 21 may be at least three, and the feature amount (first feature amount) may be the area of a region defined by a plurality of sample points. For example, when focusing on the right shoulder, right elbow, right hand, trunk, right knee, and right ankle in the first body model 21 shown in FIG. 5, the area of the region surrounded by the straight lines connecting these specific points 22 (the hatched area in FIG. 5) is obtained.
- similarly, the second extraction unit 55 obtains, from the second body model 32, the area of the region (hatched part) surrounded by the straight lines connecting the right shoulder, right elbow, right hand, trunk, right knee, and right ankle, and sets this area as the second feature amount. That is, the number of sample points in the second body model 32 may be at least three, and the reference amount (second feature amount) may be the area of a region defined by a plurality of sample points.
- the evaluation unit 56 evaluates the deviation of the posture between the user 2 and the comparison target 31 by comparing the area extracted from the first body model 21 as the first feature quantity with the area extracted from the second body model 32 as the second feature quantity. Since the area thus obtained represents a tendency of the posture (such as the degree of opening of the body), the evaluation unit 56 can roughly evaluate the posture deviation between the user 2 and the comparison target 31 by comparing these areas.
- in this case, the areas of a plurality of regions may be compared between both body models, or the area of a single region may be compared.
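- to illustrate the area-type feature amount described above, the following is a minimal Python sketch using the shoelace formula; the sample-point coordinates are hypothetical and not taken from the disclosure.

```python
# A minimal sketch, assuming sample points are given as (x, y) pixel
# coordinates: the shoelace formula computes the area of the region
# enclosed by straight lines connecting the specific points in order.
def polygon_area(points: list[tuple[float, float]]) -> float:
    """Area of the polygon whose vertices are the sample points, in order."""
    n = len(points)
    if n < 3:
        raise ValueError("at least three sample points are required")
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical right shoulder, right elbow, right hand, trunk, right knee,
# and right ankle positions (image coordinates) for the first body model.
user_area = polygon_area([(320, 180), (380, 240), (420, 300),
                          (320, 320), (340, 420), (350, 500)])
print(f"first feature amount (area): {user_area:.0f} px^2")
```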
- as described above, the exercise support system 1 includes the display device 3 that displays an image on the display surface 30 that is disposed in front of the user 2 and faces the user 2, the half mirror 6 disposed on the user 2 side with respect to the display surface 30, the storage unit 51 that stores, as a comparison image, the image of the comparison target 31 performing the same exercise as the exercise performed by the user 2, and the display control unit 52 for displaying the comparison video on the display surface 30.
- further, the exercise support system 1 includes a first extraction unit 54 that detects the position of a specific point on the body of the user 2 and extracts a first feature value representing the posture of the user 2 based on the position of the specific point, a second extraction unit 55 that detects the position of the specific point of the comparison target 31 and extracts a second feature amount representing the posture of the comparison target 31 based on the position of the specific point, an evaluation unit 56 that compares the first feature amount with the second feature amount and evaluates the deviation of the posture between the user 2 and the comparison target 31, and a presentation unit 57 that presents the evaluation result.
- in other words, the exercise support system 1 includes the display device 3 having the display surface 30 that displays an image to the user 2, a storage unit (comparison image storage unit) 51 that stores a comparison image, which is an image of an exerciser (comparison target 31) performing a predetermined exercise, a display control unit (comparison image display unit) 52 that displays the comparison image stored in the comparison image storage unit 51 on the display surface 30, mirror image display means for displaying a mirror image of the user 2 superimposed on the comparison image, a first extraction unit (feature amount extraction unit) 54 that obtains a feature amount (first feature amount) representing the posture of the user 2 based on the position of a sample point detected on the body of the user 2, an evaluation unit (posture evaluation unit) 56 that compares the feature amount obtained by the feature amount extraction unit 54 with a reference amount (second feature amount) representing the posture of the exerciser and evaluates the deviation of the posture of the user 2 from the posture of the exerciser, and a presentation unit 57 that presents the evaluation result.
- the mirror image display means is the half mirror 6 disposed on the front surface of the display device 3.
- the exercise support system 1 of the present embodiment includes a reference amount extraction unit (second extraction unit) 55.
- the reference amount extraction unit 55 detects the position of a predetermined sample point (specific point) on the body of the exerciser (comparison target 31) in the comparison image, and obtains the reference amount (second feature amount) representing the posture of the exerciser based on the position of the sample point.
- with this configuration, the user 2 can exercise while comparing his/her mirror image 20 reflected in the half mirror 6 with the comparison target 31 in the comparison image shown on the display surface 30, so it becomes easy to exercise in accordance with the comparison target 31. As a result, the user 2 can more easily learn the correct posture (movement) during exercise, and the effect of the exercise can be sufficiently obtained.
- furthermore, the evaluation unit 56 compares the first feature amount representing the posture of the user 2 with the second feature amount representing the posture of the comparison target 31, evaluates the posture deviation between the user 2 and the comparison target 31, and causes the presentation unit 57 to present the evaluation result. Therefore, the user 2 can recognize the deviation of his/her posture from the comparison target 31, that is, how much his/her posture differs from the posture of the comparison target 31 in the comparison video. When the user 2 is exercising with a posture that deviates greatly from that of the comparison target 31, the user 2 can be made aware of this and can improve the movement, which makes it easier to learn the correct posture during exercise.
- the comparison video is a video of the comparison target 31 that shows an example of exercise to be performed by the user 2. That is, the predetermined exercise is an exercise that serves as an example of the exercise performed by the user 2.
- the first extraction unit 54 and the second extraction unit 55 each detect the position of a plurality of specific points and connect the specific points to generate a body model, Extract features from the model. That is, the feature amount extraction unit (first extraction unit) 54 detects the positions of a plurality of sample points (specific points) in the body of the user 2 and determines the body of the user 2 based on the positions of the plurality of sample points. A body model (first body model) 21 to be shown is created, and a feature amount (first feature amount) is obtained based on the body model 21.
- similarly, the reference amount extraction unit (second extraction unit) 55 detects the positions of a plurality of sample points (specific points) on the body of the comparison target 31 in the comparison image, creates a body model (second body model) 32 representing the body of the comparison target 31 based on the positions of the plurality of sample points, and obtains the reference amount (second feature amount) based on the body model 32.
- the first extraction unit 54 and the second extraction unit 55 use an angle of a straight line connecting a plurality of specific points with respect to a predetermined reference line as a feature amount. That is, the feature amount is an angle between a straight line connecting two sample points selected from a plurality of sample points and a predetermined reference line.
- the feature amount may be the inclination of the body of the user 2.
- the inclination of the body of the user 2 may be the inclination of the shoulder of the user 2.
- specifically, the feature quantity extraction unit 54 detects the positions of the sample point 22 of the right upper limb and the sample point 22 of the left upper limb of the user 2, and calculates the angle between the straight line L100 connecting the sample point 22 of the right upper limb with the sample point 22 of the left upper limb and the horizontal line L200 as the shoulder inclination of the user 2.
- the inclination of the body of the user 2 may be the inclination of the trunk of the user 2.
- in this case, the feature amount extraction unit 54 detects the positions of the sample point 22 on the head of the user 2 and the sample point 22 on the waist, connects the sample point 22 on the head and the sample point 22 on the waist with a straight line L110, and calculates the angle between the straight line L110 and the vertical line L210 as the inclination of the trunk of the user 2.
- in these cases, the reference amount extraction unit 55 may obtain the body inclination (shoulder or trunk inclination) of the comparison target 31 as the reference amount (second feature amount).
- the feature amount may be a range of motion of a specific part of the body of the user 2.
- the specific part may be the upper limb of the user 2.
- in this case, the feature amount extraction unit 54 detects the position of the sample point 22 of the upper limb of the user 2 and of the sample point 22 of the shoulder corresponding to the upper limb, and is configured to calculate the angle between the straight line L120 connecting the sample point 22 of the upper limb with the sample point 22 of the shoulder and the vertical line L210 as the range of motion of the upper limb.
- the reference amount extraction unit 55 may obtain the movable range of a specific part (for example, an upper limb) of the comparison target 31 as the reference amount (second feature amount).
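- the angle-type feature amounts above (shoulder inclination relative to the horizontal line L200, trunk inclination and limb range of motion relative to the vertical line L210) all reduce to the angle between a segment connecting two sample points and a reference line. The following is a minimal Python sketch of this computation; the coordinates are hypothetical image coordinates with y increasing downward.

```python
import math

# A minimal sketch of the angle-type feature amounts, assuming sample
# points are (x, y) image coordinates (y grows downward). All point
# values below are hypothetical.
def segment_angle(p1, p2, reference: str) -> float:
    """Angle in degrees between the line p1-p2 and a horizontal or
    vertical reference line."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    angle = math.degrees(math.atan2(dy, dx))   # vs. horizontal line L200
    if reference == "vertical":                # vs. vertical line L210
        angle = 90.0 - abs(angle)
    return angle

# Shoulder inclination: line L100 through the left/right upper-limb
# sample points compared with the horizontal line L200.
shoulder_tilt = segment_angle((250, 200), (400, 210), "horizontal")
# Upper-limb range of motion: line L120 through the shoulder and limb
# sample points compared with the vertical line L210.
arm_elevation = segment_angle((400, 210), (520, 120), "vertical")
print(f"shoulder tilt: {shoulder_tilt:.1f} deg, arm angle: {arm_elevation:.1f} deg")
```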
- the first extraction unit 54 and the second extraction unit 55 use the area of a region surrounded by straight lines connecting a plurality of specific points as a feature amount. That is, the number of sample points is at least three, and the feature amount is the area of a region defined by a plurality of sample points.
- the display control unit 52 adjusts the position and size of the comparison video on the display surface 30 so that the user 2 sees the image of the user 2 and the image of the comparison target 31 overlapping. That is, the comparison image display unit (display control unit) 52 is configured to adjust the position and size of the comparison image on the display surface 30 so that the comparison image overlaps the mirror image of the user.
- the evaluation unit 56 digitizes the difference between the first feature value and the second feature value and causes the presentation unit 57 to present it. That is, the posture evaluation unit (evaluation unit) 56 is configured to obtain a numerical value indicating the difference between the feature amount (first feature amount) and the reference amount (second feature amount), and the presentation unit 57 is configured to present the numerical value obtained by the posture evaluation unit 56.
- alternatively, the first extraction unit 54 and the second extraction unit 55 may detect the positions of the specific points 22 over the whole body of the user 2 or of the comparison target 31 and generate body models corresponding to the whole body; the first extraction unit 54 may then use the coordinate position of a specific point 22 of the user 2 as the first feature amount, and the second extraction unit 55 may use the coordinate position of the corresponding specific point 22 of the comparison target 31 as the second feature amount.
- in this case, the evaluation unit 56 evaluates the shift of the specific points 22 corresponding to the same part of the body between the user 2 and the comparison target 31 by comparing the first feature quantity with the second feature quantity. For example, when the positions of the trunks of the user 2 and the comparison target 31 are matched and the coordinate positions of the specific points 22 corresponding to the right hand are used as the first and second feature quantities, the evaluation unit 56 compares both feature quantities and evaluates the positional deviation of the right hand between the user 2 and the comparison target 31. That is, the evaluation unit 56 evaluates the posture deviation between the user 2 and the comparison target 31 by obtaining the relative distance between the specific points 22 corresponding to the same part of the body.
- alternatively, the evaluation unit 56 may evaluate the posture deviation based on the relative positional relationship of specific points 22 between the user 2 and the comparison target 31, for example whether the specific point 22 corresponding to the right hand is higher than the specific point 22 corresponding to the right shoulder. In this case, when the specific point 22 corresponding to the right hand of the user 2 is located lower than the specific point 22 corresponding to the right shoulder during, for example, an exercise of spreading the right arm wide to the side, the presentation unit 57 presents advice such as “Please raise it higher.”
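- the following Python sketch illustrates this coordinate-position comparison: the models are aligned at the trunk, per-joint distances are computed, and a relative check ("is the right hand above the right shoulder?") can drive the advice. All point data and names are hypothetical.

```python
import math

# A minimal sketch of coordinate-position comparison: after aligning the
# trunks, the relative distance between specific points for the same body
# part is used as the posture deviation. Point data are hypothetical.
def joint_deviations(user_pts: dict, target_pts: dict,
                     anchor: str = "trunk") -> dict:
    """Per-joint Euclidean distance after translating both models so
    that their anchor (trunk) points coincide."""
    ux, uy = user_pts[anchor]
    tx, ty = target_pts[anchor]
    out = {}
    for name in user_pts:
        dx = (user_pts[name][0] - ux) - (target_pts[name][0] - tx)
        dy = (user_pts[name][1] - uy) - (target_pts[name][1] - ty)
        out[name] = math.hypot(dx, dy)
    return out

user = {"trunk": (320, 300), "right_hand": (430, 260), "right_shoulder": (370, 210)}
coach = {"trunk": (315, 305), "right_hand": (470, 180), "right_shoulder": (365, 212)}
for joint, d in joint_deviations(user, coach).items():
    print(f"{joint}: {d:.1f} px")

# A relative check such as "is the right hand above the right shoulder?"
# can trigger advice like "Please raise it higher."
above = user["right_hand"][1] < user["right_shoulder"][1]  # smaller y = higher
print("right hand above shoulder:", above)
```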
- the acquisition unit 53 may simultaneously acquire a distance image capturing the user 2 from the front and a distance image capturing the user 2 from the side.
- the left-right inclination of the body is extracted as a feature amount from the distance image capturing the user 2 from the front
- the front-rear inclination of the body is extracted as the feature amount from the distance image capturing the user 2 from the side.
- the first extraction unit 54 is not limited to a configuration that detects the position of the specific point of the user 2 using the distance image obtained by the distance image sensor 4 as described above.
- for example, the first extraction unit 54 may detect the position of the specific point of the user 2 using a two-dimensional image of the user 2 photographed by a two-dimensional camera such as a CCD (Charge Coupled Device) camera, or using the sensor output of a gyro-sensor-type motion capture device attached to the user 2.
- the second extraction unit 55 may be configured to detect the position of the specific point of the comparison target 31 from other than the distance image.
- the exercise support system 1A of this embodiment is different from the exercise support system 1 of Embodiment 1 in that the half mirror 6 is not provided.
- instead, an imaging device 7 is provided that is arranged in front of the user 2 and has its lens directed so as to image the user 2 from the front.
- the imaging device 7 is configured to capture the user 2 and generate an image of the user 2.
- the imaging device 7 is installed at the eye level of the user 2 on the front side (user 2 side) of the display device 3 using, for example, a camera stand. Further, the tilt angle and pan angle of the imaging device 7 are adjusted so that the whole body of the user 2 is included in the field of view and the left-right center line of the body of the user 2 standing upright coincides with the left-right center line of the captured image.
- the adjustment of the position and orientation of the imaging device 7 as described above is performed as an initial setting after the standing position of the user 2, the height of the line of sight, etc. are determined. As a result, the imaging device 7 captures a moving image (hereinafter referred to as “whole body image”) showing the whole body of the user 2.
- the control device 5A is connected to both the display device 3 and the imaging device 7, and has a function of processing the image captured by the imaging device 7 and displaying the processed image on the display device 3.
- the control device 5A includes an inversion processing unit 58 that obtains the whole body video from the imaging device 7 through the acquisition unit 53 and generates an inverted video by reversing the obtained whole body video left and right. That is, the inversion processing unit 58 is configured to generate a horizontally inverted image by horizontally inverting the image of the user 2 generated by the imaging device 7. Further, the control device 5A causes the display control unit 52A to display the inverted video on the display device 3 so that the horizontal center line of the inverted video coincides with the horizontal center line of the display surface 30.
- the display control unit 52A functions as an inverted image display unit that displays the horizontally inverted image generated by the inversion processing unit 58 on the display surface 30.
- the image of the whole body of the user 2 is displayed on the display surface 30 of the display device 3 so as to be horizontally reversed like a mirror image reflected in the mirror.
- if no processing is performed, the image picked up by the imaging device 7 is not reversed left and right like a mirror image, so the left and right as seen from the user 2 facing the display surface 30 are opposite to the left and right in the image displayed on the display surface 30; that is, the left half of the user 2 appears on the right side of the display surface 30 and the right half on the left side. The display device 3 therefore displays the inverted video so that the right half of the user 2 appears on the right side and the left half of the user 2 appears on the left side of the display surface 30.
- in this way, the display device 3 can cause the user 2 to visually recognize the inverted video displayed on the display surface 30 as if it were a mirror image of his/her whole body.
- the control device 5A processes (inverts) the video input from the imaging device 7 in real time (about 15 to 30 frames per second), and outputs a video signal to the display device 3.
- the display device 3 receives the video signal from the control device 5A and displays a reverse video in real time. Therefore, a moving image that moves in accordance with the actual movement of the user 2 is displayed on the display surface 30 of the display device 3 as a reverse video. For example, when the user 2 raises his right arm, the right arm as seen from the user 2 also rises in the reverse video displayed on the display surface 30 in accordance with the movement.
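- the real-time inversion described above is, in essence, a per-frame horizontal flip. The following Python sketch illustrates it; the patent does not name any library, so OpenCV and the camera device index are assumptions for illustration only.

```python
import cv2  # OpenCV is an assumption; the patent does not name a library.

# A minimal sketch of the inversion processing unit 58 / display control
# unit 52A: read frames from a camera, flip them horizontally so the
# displayed video behaves like a mirror image, and show them in real time.
cap = cv2.VideoCapture(0)          # imaging device 7 (device index assumed)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mirrored = cv2.flip(frame, 1)  # flipCode=1 -> left-right inversion
    cv2.imshow("display surface 30", mirrored)
    if cv2.waitKey(33) & 0xFF == ord("q"):  # ~30 fps; quit with 'q'
        break
cap.release()
cv2.destroyAllWindows()
```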
- in this way, the exercise support system 1A of the present embodiment does not present an optically formed mirror image but causes the user 2 to visually recognize the inverted video displayed on the display device 3, allowing the user 2 to perceive the inverted video as his/her own mirror image.
- the display control unit 52A causes the display device 3 to display a comparison image indicating the comparison target 31 that performs the same exercise as the exercise to be performed by the user 2 together with the inverted image generated by the inversion processing unit 58.
- the display control unit 52A uses the size adjustment unit and the position adjustment unit to adjust the size and position of the comparison video on the display surface 30 so that the image of the user 2 in the inverted video and the image of the comparison target (instructor) 31 in the comparison video overlap.
- the display control unit 52A displays one of the inverted video and the comparative video as a semi-transparent video (for example, a transmittance of 50%) so that the inverted video and the comparative video can be easily distinguished.
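- such a semi-transparent overlay can be realized by alpha blending of corresponding frames. Below is a minimal Python sketch; OpenCV and the file names are assumptions, not part of the disclosure.

```python
import cv2  # OpenCV assumed, as before; frame sources are hypothetical.

# A minimal sketch of the semi-transparent overlay: blend a frame of the
# inverted user video with a frame of the comparison video so that both
# remain distinguishable (50% transmittance, matching the example above).
inverted = cv2.imread("inverted_frame.png")       # user's mirror-video frame
comparison = cv2.imread("comparison_frame.png")   # instructor-video frame
comparison = cv2.resize(comparison, (inverted.shape[1], inverted.shape[0]))
blended = cv2.addWeighted(inverted, 0.5, comparison, 0.5, 0.0)
cv2.imwrite("display_frame.png", blended)
```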
- the exercise support system 1A of the present embodiment described above includes the display device 3 that displays an image on the display surface 30 that is disposed in front of the user 2 and faces the user 2, an imaging device 7 that is disposed in front of the user 2 and captures an image of the user 2, an inversion processing unit 58 that generates an inverted image by horizontally inverting the image of the user 2 captured by the imaging device 7, a storage unit 51 that stores, as a comparison image, the image of the comparison target 31 performing the same exercise as the user 2, and a display control unit 52A that causes the display device 3 to display the comparison image together with the inverted image when the user 2 exercises.
- further, the exercise support system 1A includes a first extraction unit 54 that detects the position of a specific point on the body of the user 2 and extracts a first feature value representing the posture of the user 2 based on the position of the specific point, a second extraction unit 55 that detects the position of the specific point of the comparison target 31 and extracts a second feature amount representing the posture of the comparison target 31 based on the position of the specific point, an evaluation unit 56 that compares the first feature amount with the second feature amount and evaluates the deviation of the posture between the user 2 and the comparison target 31, and a presentation unit 57 that presents the evaluation result.
- in other words, the exercise support system 1A of the present embodiment includes the display device 3 having the display surface 30 that displays an image to the user 2, a storage unit 51 that stores a comparison image, which is an image of an exerciser (comparison target 31) performing a predetermined exercise, mirror image display means for displaying a mirror image of the user 2 superimposed on the comparison image, a first extraction unit (feature amount extraction unit) 54 that obtains a feature amount (first feature amount) representing the posture of the user 2 based on the position of the sample point detected on the body of the user 2, and an evaluation unit that compares the feature amount obtained by the feature amount extraction unit 54 with a reference amount (second feature amount) representing the posture of the exerciser. In the present embodiment, the mirror image display means includes the imaging device 7 that captures the user 2 and generates an image of the user, the inversion processing unit 58 that generates a horizontally inverted image by horizontally inverting the image of the user 2 generated by the imaging device 7, and an inverted image display unit (display control unit) 52A that displays the horizontally inverted image generated by the inversion processing unit 58 on the display surface 30.
- with this configuration, since the half mirror 6 is omitted, the configuration can be simplified compared to the exercise support system 1 of the first embodiment.
- moreover, an existing display can be used as the display device 3 without newly installing a dedicated display, so the cost of introducing the exercise support system can be reduced.
- the comparison video may also be, for example, a video of computer graphics representing a human body, or of the second body model described above.
- in this case, the user 2 exercises in accordance with the movement of the computer graphics or the second body model in the comparison video instead of the instructor video described above.
- alternatively, the control device 5A may store, in the storage unit 51, an image of the user 2 during exercise captured by the imaging device arranged in front of the user 2, and the display control unit 52A may display this image on the display device 3 as the comparison image at the next exercise.
- that is, the exercise support system 1A according to the present embodiment includes the imaging device 7 that is arranged in front of the user 2 and captures the video of the user 2, and the comparison video may be a video of the user captured by the imaging device 7 in the past.
- in other words, the exercise support system 1A includes the imaging device 7 that captures the user 2 performing a predetermined exercise and generates a recorded image of the user 2 performing the predetermined exercise.
- the comparison image display unit (display control unit) 52A is configured to display the recorded image generated by the imaging device 7 on the display surface 30 as a comparison image.
- the reference amount extraction unit 55 is configured to detect the position of a predetermined sample point on the body of the user 2 from the recorded image and obtain a feature amount representing the posture of the user as a reference amount based on the position of the sample point.
- the video of the user 2 captured by the imaging device during the past (previous) exercise is displayed on the display surface 30 as a comparative video.
- with this configuration, the user 2 can exercise while comparing his/her current self with his/her past self. Since the user 2 can exercise using, as a sample, a comparison target with the same physique (arm length, etc.) as himself/herself, it becomes easier for the user 2 to match his/her posture (movement) to the comparison target than when an instructor video is used as the comparison image.
- in this case, the evaluation unit 56 compares the first feature value representing the posture of the user 2 with the second feature value representing the posture of the comparison target, thereby evaluating the posture deviation between the current user 2 and the past video of the user 2. Therefore, the evaluation unit 56 can evaluate how the posture of the user 2 during exercise has changed compared with before, and can evaluate, for example, the progress of rehabilitation.
- the exercise support system 1B of the present embodiment also has an aspect as a range-of-motion training system.
- the range-of-motion training system is used for range-of-motion training for returning the range of motion of a specific part of the body, such as the limbs, to the normal range or maintaining it within the normal range.
- range-of-motion training is generally adopted to prevent or improve the limitation of the movement range (range of motion) of a patient's limbs caused by illness or injury.
- range-of-motion training is performed by continuously (for example, every day) moving the specific part of the patient that is the training target, in order to return the range of motion of the specific part of the body, such as the limbs, to the normal range or to maintain it within the normal range.
- the range of motion of the specific part of the patient is usually measured, and the measurement result is used for determining the degree of impairment and the effect (recovery) of the training.
- as the range of motion of a specific part, for example, the height of the arm when the patient raises the arm, or the relative angle between the upper arm and the forearm when the elbow joint is bent, is measured, and a method of measuring these using an angle measuring instrument is common.
- as an angle measuring instrument used for this type of measurement, an instrument equipped with an inclinometer that identifies the direction of gravity and configured to allow the measurer to derive the inclination angles of the upper arm and the forearm relative to the vertical direction has been proposed (Document 2: Japanese Patent Publication No. 4445468).
- according to Document 2, when measuring the relative angle between the upper arm and the forearm, the relative angle between the upper arm or forearm and the human body as a whole is also defined, which makes more detailed judgments possible.
- however, with such an instrument the patient can only know the measurement result (angle, etc.) of the range of motion; unless a specialist such as a therapist evaluates the range of motion of the specific part from the measurement result, the patient cannot fully understand what the measurement result means. For example, even if a measurement result indicating how many centimeters the arm has been raised is obtained, the patient cannot understand from the measurement result alone how much the range of motion has recovered or deteriorated, and therefore cannot fully understand the necessity and effect of range-of-motion training.
- the range of motion training system can present the evaluation result of the range of motion of a specific part of the body to the user.
- the exercise support system 1B of the present embodiment is used as a range of motion training system.
- the range-of-motion training system is used for rehabilitation aimed at preventing or improving the limitation of the range of movement of the patient's limbs (range of motion) due to illness or injury.
- the range-of-motion training system may also be used, for example, for training to expand the range of motion of the limbs of healthy people, or to prevent the range of motion of the limbs from becoming limited over time.
- in the following, an example in which the user uses the range-of-motion training system in a standing posture is shown, but the present invention is not limited thereto; the user may use the range-of-motion training system in a posture sitting on a chair or the like.
- the exercise support system 1B of the present embodiment is different from the exercise support system 1 of the first embodiment in a control device 5B.
- the exercise support system 1B according to the present embodiment includes a display device 3 that displays an image on a display surface 30 that is disposed in front of a user (patient) 2 and faces the user 2, a distance image sensor 4 that generates a distance image, and a display. And a control device 5B for controlling the operation of the device 3 and the like.
- the display device 3 and the distance image sensor 4 are both connected to the control device 5B.
- the exercise support system 1B of the present embodiment includes a half mirror 6 which is a mirror image display means.
- the distance image of the user 2 generated by the distance image sensor 4 is output to the control device 5B and used in processing for detecting the position of a specific part of the body of the user 2.
- the control device 5B includes a computer, and has an acquisition unit 53 that acquires the distance image from the distance image sensor 4 and a position detection unit 152 that detects the position of the specific part using the acquired distance image.
- the range-of-motion training system 1B of the present embodiment includes an index setting unit 153 that sets an index at a position on the display surface 30 corresponding to the position of the specific part of the body of the user 2 detected by the position detection unit 152.
- the control device 5B includes a video generation unit 154 that generates a graphic image and displays it on the display device 3.
- the control device 5B includes a storage unit 51, an acquisition unit 53, a first extraction unit 54, a second extraction unit 55, an evaluation unit 56, a presentation unit 57B, and a position detection unit 152.
- however, the first extraction unit 54, the second extraction unit 55, and the evaluation unit (posture evaluation unit) 56 need not necessarily be provided.
- the position detection unit 152 is configured to detect the position of a specific part of the body of the user 2.
- the position detection unit 152 detects the position of a specific part of the body of the user 2 in the distance image by image recognition technology.
- the specific part may be a specific part of the body of the user 2, and may be a leg part, a head, or the like, for example. Since the distance image of the user 2 generated by the distance image sensor 4 is a moving image that changes as the user 2 moves, the position detection unit 152 detects the position of a specific part from each frame of the distance image. Thus, the position of the specific part that changes as needed can be detected in real time.
- the position detection unit 152 detects the position of the joint of the user 2 and connects the joints with a straight line to generate a body model representing the position of a specific part in the three-dimensional space. That is, for example, when the position detection unit 152 detects the position of the right arm of the user 2 as a specific part, the position detection unit 152 detects the position of the user's right shoulder joint, right elbow joint, and right wrist, respectively. The body model corresponding to the right arm of the user 2 is generated. By using such a body model, the position detection unit 152 detects the position of the specific part of the user 2 in the three-dimensional space from the output of the distance image sensor 4 as a three-dimensional sensor.
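- to illustrate such a body model, the following minimal Python sketch represents the right arm as joints connected by straight segments; the joint names and 3D coordinates are hypothetical.

```python
# A minimal sketch of a body model for a specific part (the right arm):
# joints detected from the distance image are connected with straight
# segments. Coordinates are hypothetical 3D points (x, y, z) in metres.
right_arm_joints = {
    "right_shoulder": (0.20, 1.40, 2.10),
    "right_elbow":    (0.35, 1.15, 2.05),
    "right_wrist":    (0.45, 0.95, 2.00),
}
right_arm_segments = [("right_shoulder", "right_elbow"),
                      ("right_elbow", "right_wrist")]

def segment_endpoints(model, segments):
    """Yield the 3D endpoints of each straight line of the body model."""
    for a, b in segments:
        yield model[a], model[b]

for p, q in segment_endpoints(right_arm_joints, right_arm_segments):
    print(p, "->", q)
```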
- the index setting unit 153 is configured to determine the position of the index on the display surface 30 based on the position detected by the position detection unit 152.
- specifically, the index setting unit 153 sets the index at a position obtained by coordinate conversion of at least a part of the specific part from the coordinate position in the imaging coordinate system defined for the distance image obtained by the distance image sensor 4 to a coordinate position in the display coordinate system of the display surface 30.
- the display coordinate system here is a two-dimensional orthogonal coordinate system in which the horizontal and vertical directions of the display surface 30 of the display device 3 are coordinate axes.
- that is, the index setting unit 153 uses a predetermined conversion formula to set the index at the position on the display surface 30 obtained by coordinate conversion of at least a part of the specific part from the polar coordinate system referenced to the distance image sensor 4 to the two-dimensional orthogonal coordinate system. Thereby, the index is set at the position on the display surface 30 corresponding to the position of the specific part detected by the position detection unit 152.
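- the following Python sketch illustrates the shape of such a conversion. The sensor geometry, scale factors, display resolution, and calibration offset are all hypothetical; a real system would use the sensor's own calibration data and the predetermined conversion formula of the implementation.

```python
import math

# A minimal sketch of the coordinate conversion performed by the index
# setting unit 153: from a polar sensor reading (azimuth, elevation,
# distance) to display-surface pixel coordinates, plus a calibration
# offset for the half mirror / user geometry. All parameters hypothetical.
def sensor_to_display(azimuth_deg: float, elevation_deg: float, dist_m: float,
                      px_per_m: float = 400.0,
                      display_w: int = 1920, display_h: int = 1080,
                      offset: tuple[float, float] = (0.0, 0.0)) -> tuple[int, int]:
    """Convert a polar sensor reading to display pixel coordinates."""
    x = dist_m * math.sin(math.radians(azimuth_deg))    # metres, left-right
    y = dist_m * math.sin(math.radians(elevation_deg))  # metres, up-down
    u = display_w / 2 + x * px_per_m + offset[0]
    v = display_h / 2 - y * px_per_m + offset[1]        # screen y grows downward
    return int(round(u)), int(round(v))

# Wrist detected 10 degrees to the right and 5 degrees up, at 2.0 m.
print(sensor_to_display(10.0, 5.0, 2.0))
```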
- in the present embodiment, the specific part is the entire arm, and the index setting unit 153 sets the index at the position on the display surface 30 corresponding to the position of the wrist, which is a part of the specific part (arm).
- the index setting unit 153 may be configured to set the index at a position on the display surface 30 corresponding to at least a part of the specific part, or may be configured to set the index at positions on the display surface 30 corresponding to the entire specific part.
- the index setting unit 153 may set the index at a position on the display surface 30 corresponding to the position of the entire arm. In this case, since the arm portion as the specific part has a certain size, the index is set within a certain range on the display surface 30.
- the index setting unit 153 in the present embodiment has a function of calibrating the position of the index so that the index is set at a position on the display surface 30 that overlaps the mirror image of the specific part in the mirror image of the user 2 reflected on the half mirror 6. In other words, since the position of the mirror image reflected on the half mirror 6 as viewed from the user 2 changes depending on the positional relationship between the half mirror 6 and the user 2, the index setting unit 153 calibrates the position of the index by giving an appropriate offset to the position determined by the coordinate conversion described above.
- the position of the index on the display surface 30 is calibrated according to the positional relationship between the half mirror 6 and the user 2.
- the positional relationship between the half mirror 6 and the user 2 may be determined in advance so that the user 2 stands at the front center of the half mirror 6.
- a pressure sensor (not shown) that detects the position of the center of gravity of the user 2 may be added, and the positional relationship between the half mirror 6 and the user 2 may be obtained from the detection result of the pressure sensor.
- the control device 5B includes a height registration unit (not shown) in which a uniform value (for example, 170 cm) is registered in advance as the height of the user 2.
- the height registered in the height registration unit may be directly input by the user 2.
- in this case, by calibrating the index to a position based on the height of the user 2, the index setting unit 153 can set the index at a position that, from the viewpoint of the user 2, overlaps the specific part of his/her own mirror image.
- alternatively, the position of a specific part (for example, the head) of the mirror image reflected on the half mirror 6 can be obtained from the position and orientation of the distance image sensor 4 and from the standing position and head-top position of the user 2 detected from the distance image. Accordingly, the index setting unit 153 can calibrate the position of the index to a position that overlaps the specific part of the mirror image as viewed from the viewpoint of the user 2.
- the calibration of the index position may be performed manually at the time of initial setting or may be automated.
- the video generation unit 154 includes a determination unit 1541 and a display control unit 52B.
- the determination unit 1541 is configured to determine whether or not the index is located at a predetermined position on the display surface 30.
- the display control unit 52B functions as an event image display unit that displays a predetermined event image at a predetermined position when the determination unit 1541 determines that the index is positioned at a predetermined position.
- the video generation unit 154 generates a graphic image associated with a process and displays it on the display device 3 so that the predetermined process is executed when the position of the index set by the index setting unit 153 overlaps the graphic image. That is, the video generation unit 154 displays an appropriate icon on the display device 3 and associates the process to be executed with the icon, so that when the index position specified by the index setting unit 153 overlaps the icon, the process associated with the icon can be executed.
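- a minimal Python sketch of this overlap check and icon-triggered process follows; the icon radii, positions, and the "pop" handler are hypothetical.

```python
import math

# A minimal sketch of the determination unit 1541 / event image display:
# when the index position overlaps an icon 131, the process associated
# with the icon (here, "pop the balloon") is executed.
balloons = [{"pos": (400, 300), "r": 40, "popped": False},
            {"pos": (700, 250), "r": 40, "popped": False}]

def update(index_pos: tuple[float, float]) -> None:
    """Pop every still-visible balloon the index currently overlaps."""
    for b in balloons:
        if b["popped"]:
            continue
        if math.dist(index_pos, b["pos"]) <= b["r"]:
            b["popped"] = True          # would trigger the burst animation
            print("pop at", b["pos"])   # and, optionally, a bursting sound

update((410, 310))   # index moved onto the first balloon -> it pops
update((410, 310))   # already popped: nothing happens
```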
- in the present embodiment, the video generation unit 154 displays a large number of icons 131 representing balloons at substantially equal intervals within a predetermined range on the display surface 30. As a result, the mirror image of the user 2 is reflected on the front surface of the half mirror 6, and the icons 131 generated by the video generation unit 154 are displayed through the half mirror 6.
- the icon 131 is not a still image of a balloon, but a movie of a balloon that fluctuates as if it is floating in the air.
- the icon 131 is associated with a process of erasing the balloon pattern with an animation that causes the balloon to burst when the positions of the indices overlap.
- the icon 131 may also be associated with a process of generating, from a speaker (not shown) of the control device 5B, a sound corresponding to the change of the pattern, for example, the popping sound of a balloon bursting when the position of the index overlaps the icon.
- the icon 131 may be associated with a process for randomly changing the color of the balloon to a rainbow color.
- it is desirable that the brightness of the display device 3 or of the room be appropriately adjusted so that, as viewed from the user 2, there is no great difference in appearance between the mirror image reflected on the half mirror 6 and the icons 131 displayed on the display device 3.
- the position of the index set by the index setting unit 153 may be used only in the internal processing of the control device 5B (the processing of the graphic images 131), and it is not essential that the index itself be displayed on the display device 3. However, if the video generation unit 154 displays a mark 132 of an appropriate shape (for example, a circle) on the display device 3 at the position where the index is set, the user 2 can reliably recognize the position of the index.
- the user 2 can move the index on the display surface 30 by moving the specific part (here, the arm) while looking at his/her mirror image reflected on the half mirror 6.
- in addition, the balloon of an icon 131 that overlaps the position of the index is burst. Therefore, by looking at the icons 131 displayed on the display surface 30, the user 2 can visually recognize the range over which the specific part has moved. In this embodiment, since the specific part is the arm, the user 2 can visually recognize how high his/her arm has been raised.
- in this way, while the video generation unit 154 displays the icons 131 on the display device 3, the user 2 can perform range-of-motion training, which prevents the range of motion of a specific part of the body such as the limbs from becoming limited, by moving the specific part.
- a timer (not shown) counts a predetermined training time after a predetermined operation for starting the range-of-motion training is performed on the input interface (not shown) of the control device 5B. It is desirable that the remaining training time be displayed on the display surface 30 of the display device 3 by the video generation unit 154 and thereby presented to the user 2.
- the exercise support system 1B of the present embodiment includes an evaluation unit (second evaluation unit) 155 that evaluates the range of motion of the specific part by comparing the evaluation target obtained from the change of the position of the specific part with a predetermined evaluation criterion.
- the control device 5B further includes a presentation unit 57 that presents the evaluation result of the evaluation unit 155.
- the evaluation unit 155 includes an evaluation data creation unit 1551 that creates evaluation data indicating the range of motion of the specific part based on the position detected by the position detection unit 152, and the evaluation data and reference data created by the evaluation data creation unit 1551. And a range of motion evaluation unit 1552 that evaluates the range of motion of the specific part based on the comparison.
- the evaluation unit 155 evaluates the movable range of the specific part based on the range in which the specific part of the user 2, as detected by the position detection unit 152, actually moved during the period (training time) in which the video generation unit 154 displayed the icons 131 on the display device 3. That is, the evaluation data includes data indicating the movable range of the specific part in the predetermined direction.
- specifically, a score is associated in advance with each icon 131 displayed on the display surface 30, and the evaluation unit 155 calculates the score corresponding to the icons 131 for which the balloon-bursting process was executed due to the overlapping of the index position, and evaluates this score as the range of motion of the arm (specific part).
- the height to which the user 2 raises the arm is directly related to the range of motion of the arm, and it can be evaluated that the higher the arm is raised, the wider the range of motion of the arm. Therefore, in the present embodiment, scores are assigned according to the positions of the icons 131 on the display surface 30 so that an icon 131 displayed at a higher position has a higher score, and the highest score among the acquired scores is taken as the score of the user 2. Thereby, the higher the user 2 raises the tip of the arm, the higher the obtainable score becomes.
- the evaluation unit 155 evaluates the range of motion of the arm part in the form of a score, with the height of the arm part from the floor as an evaluation target.
- the score assignment for each icon 131 is stored in the storage unit 51 of the control device 5B, and the evaluation unit 155 calculates the score of each icon 131 according to the score assignment read from the storage unit 51.
- for example, 100 points are assigned to the icon 131 displayed at the highest position on the display surface 30 so that the standard score obtainable by a healthy person is 100 points, and 20 points are assigned to the icon 131 displayed at the lowest position on the display surface 30. That is, the storage unit 51 serving as the standard storage unit stores the assignment of scores to the icons 131 as standard information indicating the standard of the evaluation target (arm height) of the evaluation unit 155, and the evaluation unit 155 compares the evaluation target with the standard information to evaluate the range of motion of the arm relative to the standard.
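- the following Python sketch illustrates one such height-based score assignment, interpolating linearly between the 20-point and 100-point positions given above; the screen coordinates are hypothetical.

```python
# A minimal sketch of height-based score assignment: icons displayed
# higher on the display surface receive higher scores, linearly between
# the 20-point (lowest) and 100-point (highest) positions.
def icon_score(icon_y: float, y_lowest: float = 1000.0,
               y_highest: float = 100.0) -> float:
    """Screen y grows downward, so a smaller y means a higher position."""
    t = (y_lowest - icon_y) / (y_lowest - y_highest)  # 0 at bottom, 1 at top
    t = max(0.0, min(1.0, t))
    return 20.0 + t * (100.0 - 20.0)

# The user's score is the highest score among the popped icons.
popped_icon_ys = [850.0, 620.0, 430.0]
user_score = max(icon_score(y) for y in popped_icon_ys)
print(f"score: {user_score:.0f} / 100")
```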
- the storage unit 51 may store standard information for each age, sex, and height, for example.
- in this case, the evaluation unit 155 selects the standard information to be compared with the evaluation target according to the age, sex, and height of the user 2. That is, the assignment of scores to the icons 131 varies depending on the age, sex, and height of the user 2.
- as a method by which the evaluation unit 155 evaluates the movable range of the specific part in a predetermined direction, besides giving a score corresponding to the height of the specific part of the user 2 from the floor as described above, there is, for example, a method of measuring the moving distance of the specific part of the user 2 in a certain direction. Specifically, if the specific part is the arm, the evaluation unit 155 may measure the vertical distance from the position (initial position) of the tip of the arm when the user 2 lowers the arm to the position of the tip when the arm is raised, and use this distance as the evaluation target.
- in this case, since the evaluation unit 155 can evaluate that the movable range of the arm is wider as the measured distance increases, the scores only need to be assigned so that an icon 131 farther from the initial position has a higher score. Furthermore, the horizontal (left-right) distance from the center line of the trunk to the tip of the specific part (for example, the right arm) may also be considered as an evaluation target.
- alternatively, the evaluation unit 155 may be configured to evaluate the movable range of the specific part with the area of the region through which the specific part passes in the plane (two-dimensional space) along the display surface 30 as the evaluation target. That is, the evaluation data may include data indicating the area of the region through which the specific part has passed in a plane parallel to the display surface 30. Specifically, the index setting unit 153 sets the index at positions on the display surface 30 corresponding to the entire specific part (arm), and the evaluation unit 155 takes as the evaluation target the area of the region through which the index passes in the plane along the display surface 30.
- in this case, the evaluation unit 155 calculates the total score of the icons 131 for which the balloon-bursting process was executed due to the overlapping of the index position, and evaluates this score as the range of motion of the specific part. In this case as well, a score is assigned to each icon 131 so that the standard score obtainable by a healthy person is 100 points, and the evaluation unit 155 evaluates the movable range of the arm relative to the standard by comparing the evaluation target with the standard information.
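- the swept area can be approximated, for instance, by marking visited cells on a coarse grid over the display surface; the following Python sketch illustrates this, with the grid size and trace positions hypothetical.

```python
# A minimal sketch of evaluating the swept area: the display surface is
# divided into a coarse grid, cells visited by the index (the arm) are
# marked, and the area is the number of visited cells times the cell area.
CELL = 50  # cell edge length in pixels (hypothetical)

def swept_area(index_positions: list[tuple[float, float]]) -> float:
    """Approximate area (in px^2) of the region the index passed through."""
    visited = {(int(x // CELL), int(y // CELL)) for x, y in index_positions}
    return len(visited) * CELL * CELL

trace = [(400, 800), (420, 700), (450, 600), (500, 500), (560, 420)]
print(f"swept area: {swept_area(trace):.0f} px^2")
```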
- alternatively, the evaluation unit 155 may include, in the evaluation target, the volume of the region through which the specific part passes in the three-dimensional space. That is, the evaluation data may include data indicating the volume of the region through which the specific part has passed, and the evaluation unit 155 evaluates that the larger the volume quantified in the three-dimensional space, the wider the movable range of the specific part.
- in this case, the evaluation unit 155 converts the volume into, for example, a score out of 100 points based on the standard information in the storage unit 51, such that the standard score obtainable by a healthy person is 100 points, and compares the evaluation target with the standard information to evaluate the range of motion relative to the standard.
- the evaluation unit 155 may include, as an evaluation target, the time required for the specific part to perform a predetermined operation, such as an operation in which the arm part as the specific part reciprocates once in the vertical direction. That is, the evaluation data may include data indicating the time required for the user 2 to perform a predetermined operation at the specific part.
- for example, by dividing the distance from the initial position (for example, the tip position with the arm lowered) to the target position (for example, the tip position with the arm raised to the maximum) by the movement time, the operating speed of the specific part can be obtained, and this operating speed can be used as an index of the degree of recovery.
- in this case, based on the time required for the specific part to perform the predetermined operation, the evaluation unit 155 evaluates that the shorter the time, the wider the movable range of the specific part. The evaluation unit 155 converts the time into, for example, a score out of 100 points based on the standard information in the storage unit 51, such that the standard score obtainable by a healthy person is 100 points, and compares the evaluation target with the standard information to evaluate the range of motion relative to the standard.
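- the following Python sketch illustrates this speed-based evaluation; the reference speed standing in for the "healthy person" standard, and all measured values, are hypothetical.

```python
# A minimal sketch of the time-based evaluation: the operating speed is
# the distance from the initial position to the target position divided
# by the movement time; a higher speed (shorter time) scores higher.
def speed_score(initial_y: float, raised_y: float, seconds: float,
                standard_speed: float = 0.8) -> float:
    """Convert the arm-raising speed (m/s) into a score out of 100,
    where the hypothetical healthy-person standard maps to 100 points."""
    speed = abs(raised_y - initial_y) / seconds
    return min(100.0, 100.0 * speed / standard_speed)

# Arm tip rises from 0.9 m to 1.7 m above the floor in 1.25 s.
print(f"score: {speed_score(0.9, 1.7, 1.25):.0f} / 100")
```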
- the evaluation unit 155 may include a trajectory (including both the two-dimensional space and the three-dimensional space) of the movement of the arm portion as the specific part in the evaluation target. That is, the evaluation data may include data indicating the locus of the specific part.
- in this case, the storage unit 51 stores a plurality of patterns of standard arm movement trajectories of healthy persons as standard information, and the evaluation unit 155 converts the deviation of the evaluation target from the standard information into, for example, a score out of 100 points.
- the presentation unit 57B presents the evaluation result of the range of motion of the specific part made by the evaluation unit 155 to the user 2 in this way. That is, the presentation unit 57B is configured to present the result of evaluation by the evaluation unit (range-of-motion evaluation unit) 155. Specifically, the presentation unit 57B presents the evaluation result of the evaluation unit 155 to the user 2 by voice or light, or is configured to display the evaluation result of the evaluation unit 155 on the display device 3.
- the content presented by the presentation unit 57B may be only the evaluation result of the range of motion of the specific part, or may include, in addition to the evaluation result, a numerical value quantitatively indicating the evaluation target, such as the height (distance), area, volume, time, or trajectory. For example, when the evaluation unit 155 gives a score according to the height of the arm of the user 2 from the floor using the balloon icons 131 as described above, the presentation unit 57B may present only the acquired score, or may also present a numerical value indicating the height of the arm of the user 2 or the number of balloons that were burst.
- the presentation unit 57B may present together the evaluation object and the evaluation criteria (standard information) used for the evaluation by the evaluation unit 155, or may present a deviation between the two.
- the user 2 can know not only the evaluation result but also the standard value of the evaluation object and the deviation from the standard value, which can be the target of future range of motion training.
- as described above, the exercise support system (range-of-motion training system) 1B includes the display device 3 that displays an image on the display surface 30 that is disposed in front of the user 2 and faces the user 2, a position detection unit 152 that detects the position of a specific part of the body of the user 2 that changes with the operation of the user 2, an index setting unit 153 that sets an index at a position on the display surface 30 corresponding to at least a part of the position of the specific part, a video generation unit 154 that generates a graphic image associated with a process and displays it on the display device 3 so that the predetermined process is executed when the position of the index overlaps the graphic image, an evaluation unit 155 that evaluates the range of motion of the specific part by comparing an evaluation target obtained from the change in the position of the specific part detected by the position detection unit 152 with a predetermined evaluation criterion, and a presentation unit 57B that presents the evaluation result of the evaluation unit 155.
- in other words, the exercise support system 1B includes a position detection unit 152 that detects the position of a specific part of the body of the user 2, an index setting unit 153 that determines the position of the index on the display surface 30 based on the position detected by the position detection unit 152, an evaluation data creation unit 1551 that creates evaluation data indicating the movable range of the specific part based on the position detected by the position detection unit 152, and a range-of-motion evaluation unit 1552 that evaluates the range of motion of the specific part based on the comparison between the evaluation data created by the evaluation data creation unit 1551 and the reference data. The presentation unit 57B is configured to present the result of evaluation by the range-of-motion evaluation unit 1552.
- the exercise support system 1B further includes a half mirror 6 that is disposed on the user 2 side with respect to the display surface 30 and transmits the image displayed on the display device 3.
- the index setting unit 153 is configured to set an index at a position that overlaps a mirror image of a specific part of the mirror image of the user 2 reflected on the half mirror 6 on the display surface 30.
- the mirror image display means is the half mirror 6 disposed on the front surface of the display device 3.
- the index setting unit 153 is configured to determine the position of the index so that the position of the index corresponds to the position of the display surface 30 that overlaps the specific part reflected on the half mirror 6.
- the evaluation result of the range of motion of the specific part of the body can be presented to the user 2.
- that is, the evaluation unit 155 evaluates the range of motion of the specific part by comparing the evaluation target obtained from the change in the position of the specific part with a predetermined evaluation criterion, and the evaluation result is fed back to the user 2 from the presentation unit 57B. Therefore, the user 2 can not only know the measurement result of the movable range of his/her specific part but also understand what the measurement result means. That is, the user 2 can know not only a measurement result such as how many centimeters the arm was raised, but also, for example, how many points out of 100 the range of motion is evaluated at, and can thus fully understand the necessity and effects of the range-of-motion training.
- the user 2 moves the specific part as far as possible within its range of motion without being particularly conscious of doing so; the user 2 therefore obtains the effect of sufficient range-of-motion training simply by moving the body as if enjoying a game.
- when the specific part is an arm, the movable range of the arm cannot be accurately evaluated if a part other than the specific part (for example, the trunk) moves. In such a case, processing such as invalidating the evaluation may be performed.
- the exercise support system 1B further includes a standard storage unit 51 that stores standard information indicating a standard for the evaluation target, and the evaluation unit 155 is configured to evaluate the range of motion using the standard information as the evaluation criterion. That is, the reference data is data indicating the standard range of motion of the specific part of a healthy person.
- the evaluation unit 155 uses the standard information, which indicates the standard for the evaluation target, as the evaluation criterion, and evaluates the range of motion of the specific part by comparing the evaluation target with that criterion.
- the range of motion can therefore be evaluated relative to the standard range of motion, so a user 2 whose range of motion is limited due to illness or injury can perform range-of-motion training with the standard range of motion as a target.
- the evaluation unit 155 includes the area of the region through which the specific part passes in the plane along the display surface 30 as the evaluation target. That is, the evaluation data includes data indicating the area of a region through which the specific part has passed in a plane parallel to the display surface 30.
- the evaluation unit 155 includes the movable range of a specific part in a predetermined direction as an evaluation target. That is, the evaluation data includes data indicating the movable range of the specific part in the predetermined direction.
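As a purely illustrative sketch of the two evaluation targets just described (not the patent's implementation), the swept area in a plane parallel to the display surface can be approximated by rasterizing sampled positions onto a grid, and the movable range in a predetermined direction by the extent of the sampled coordinates; the cell size and function names are assumptions.

```python
# Minimal sketch (illustrative assumptions): area of the region a tracked
# part passed through in the display plane, and its movable range along
# one axis, from sampled (x, y) positions in meters.

def swept_area(points, cell=0.05):
    """Approximate area (m^2) covered by the samples, via grid occupancy."""
    cells = {(int(x // cell), int(y // cell)) for x, y in points}
    return len(cells) * cell * cell

def movable_range(points, axis=1):
    """Movable range along one axis (0 = x, 1 = y), e.g. vertical reach."""
    values = [p[axis] for p in points]
    return max(values) - min(values)

track = [(0.0, 0.0), (0.1, 0.3), (0.2, 0.6), (0.1, 0.9)]
print(swept_area(track), movable_range(track, axis=1))
```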
- the evaluation unit 155 evaluates the movable range of the specific part in a predetermined direction or the area of the moving region on the plane, it is not necessary to specify the position of the specific part three-dimensionally. Therefore, a two-dimensional camera such as a CCD (Charge Coupled Device) camera can be used in place of the distance image sensor 4, and there is an advantage that the processing speed of the position detection unit 152 is improved. Furthermore, if only a one-dimensional distance or the like is measured, an object detection sensor using a laser, an ultrasonic wave, or the like can be used instead of the distance image sensor 4.
- the position detection unit 152 detects the position of the specific part in three-dimensional space from the output of the three-dimensional sensor, and the evaluation unit 155 includes, as an evaluation target, the volume of the region through which the specific part passes in the three-dimensional space. That is, the position detection unit 152 is configured to detect the position of the specific part from the output of the three-dimensional sensor.
- the evaluation data includes data indicating the volume of the region through which the specific part has passed.
- the evaluation unit 155 includes the locus of the specific part as an evaluation target. That is, the evaluation data includes data indicating the locus of the specific part.
- since the evaluation unit 155 includes the volume of the moving region of the specific part or its movement locus as an evaluation target, the movable range of the specific part of the user 2, including movement in the front-rear direction, can be evaluated in detail.
- the evaluation unit 155 includes the time required for the specific part for the predetermined operation as an evaluation target. That is, the evaluation data includes data indicating the time required for the user 2 to perform a predetermined operation at the specific part.
- since the evaluation unit 155 includes the time required by the specific part for the predetermined operation as an evaluation target, there is an advantage that it is possible to evaluate how smoothly the specific part moves.
- the storage unit 51 is used as the standard storage unit. The storage unit 51 may also be used as the history storage unit described below.
- evaluation targets such as the height (distance), area, volume, time, and trajectory obtained from changes in the position of the specific part of the user 2 are accumulated in time series for each user 2.
- the exercise support system 1B may further include a history storage unit 51 that stores the history of the evaluation target as history information in time series, and the evaluation unit 155 may evaluate the range of motion using the history information as an evaluation criterion.
- the movable range evaluation unit (evaluation unit 155) may be configured to adopt the evaluation data used for the previous evaluation of the movable range of the specific part as the reference data.
- the evaluation unit 155 evaluates the range of motion of the specific part by using the history information stored in the storage unit 51 as the evaluation criterion and comparing the evaluation target with that criterion. That is, the evaluation unit 155 adopts, as the evaluation criterion, the evaluation target from a range-of-motion training session the user 2 performed in the past, and compares it with the evaluation target in the range-of-motion training currently being performed. The evaluation unit 155 can thereby evaluate, for the same user 2, how the movable range of the specific part has changed over time.
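The bookkeeping behind such a history-based evaluation might look like the following sketch; the per-user storage layout and the use of only the previous session as the criterion are assumptions consistent with the description above, not the patent's actual data structures.

```python
# Minimal sketch (illustrative assumptions): keep each session's evaluation
# target per user in time series and compare the current session against
# the previous one.

from collections import defaultdict

history = defaultdict(list)  # user id -> evaluation targets in time order

def evaluate_against_history(user_id: str, current: float):
    """Use the previous session's value as the criterion; return the change."""
    previous = history[user_id][-1] if history[user_id] else None
    history[user_id].append(current)
    if previous is None:
        return None  # first session: nothing to compare against
    return current - previous  # positive = range of motion improved

print(evaluate_against_history("user2", 110.0))  # None (first session)
print(evaluate_against_history("user2", 125.0))  # 15.0
```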
- the exercise support system (range of motion training system) 1C of the present embodiment is different from the exercise support system (range of motion training system) 1B of the third embodiment in that the half mirror 6 is not provided.
- an imaging device 7 is provided that is disposed in front of the user 2 and has a lens oriented in a direction in which the user 2 is imaged from the front.
- the exercise support system 1C of the present embodiment is different from the exercise support system 1B of the third embodiment in a control device 5C.
- the control device 5C includes a storage unit 51, an acquisition unit 53, a first extraction unit (feature amount extraction unit) 54, a second extraction unit (reference amount extraction unit) 55, an evaluation unit (posture evaluation unit) 56, a presentation unit 57B, a position detection unit 152, an index setting unit 153, an evaluation unit (second evaluation unit) 155, and a video generation unit 154, and further includes an inversion processing unit 58.
- the video generation unit 154 includes a determination unit 1541 and a display control unit 52B.
- the display control unit 52B functions as an inverted image display unit that displays the horizontally inverted image generated by the inversion processing unit 58 on the display surface 30.
- the index setting unit 153 is configured to determine the position of the index so that the position of the index corresponds to the position of the specific part in the horizontally reversed image.
- the exercise support system (range of motion training system) 1C does not present an optically formed mirror image; instead, it has the user 2 visually recognize the horizontally inverted video displayed on the display device 3, so that the user 2 can perceive the inverted video as if it were his or her own mirror image.
- the video generation unit 154 displays the graphic image 131 on the display device 3 together with the inverted video generated by the inversion processing unit 58.
- the index setting unit 153 sets the position of the index so that it overlaps the video of the specific part of the inverted video on the display surface 30.
- when the specific part is the entire arm and the index is set at a position corresponding to the wrist, which is a part of the arm, the index setting unit 153 sets the index at the position on the display surface 30 that overlaps the image of the wrist in the inverted video.
- the exercise support system 1C of the present embodiment further includes the imaging device 7 that is disposed in front of the user 2 and captures an image of the user 2.
- the video generation unit 154 is configured to cause the display device 3 to display an inverted video obtained by horizontally inverting the user's video captured by the imaging device 7 together with a graphic image.
- the index setting unit 153 is configured to set an index at a position on the display surface 30 that overlaps the video of the specific part of the inverted video.
- the mirror image display means of the exercise support system 1C of this embodiment is composed of the imaging device 7, which captures the user 2 and generates an image of the user 2, the inversion processing unit 58, which horizontally inverts the image of the user 2 generated by the imaging device 7 to generate a horizontally inverted image, and the display control unit (inverted image display unit) 52B, which displays the horizontally inverted image generated by the inversion processing unit 58 on the display surface 30.
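The capture, horizontal inversion, and display steps of this mirror image display means could be prototyped as in the following sketch, using OpenCV as an illustrative stand-in for the imaging device 7, the inversion processing unit 58, and the display control unit 52B; the library choice and window handling are assumptions, not part of the disclosure.

```python
# Minimal sketch (illustrative assumptions): capture -> flip left-right ->
# display, approximating the electronic "mirror image" described above.

import cv2

cap = cv2.VideoCapture(0)          # imaging device: front-facing camera
while True:
    ok, frame = cap.read()         # captured image of the user
    if not ok:
        break
    mirrored = cv2.flip(frame, 1)  # inversion processing: horizontal flip
    cv2.imshow("display surface", mirrored)  # inverted image display
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```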
- since the half mirror 6 is omitted, the configuration can be simplified compared with the exercise support system 1B of the third embodiment.
- in addition, an existing display can be used as the display device 3 without newly installing a dedicated display, so system introduction costs can be reduced.
- the exercise support system 1D of the present embodiment has an aspect as a center-of-gravity movement training system.
- the center-of-gravity movement training system is used for training of a user's center-of-gravity movement.
- One of the important motor functions of humans is the function of shifting the center of gravity.
- the function of shifting the center of gravity may be impaired in patients or elderly people who have a physical movement disorder due to illness or injury. People whose center-of-gravity shifting function is insufficient may be unable to move the center of gravity smoothly, for example so that the weight is applied alternately to the left and right legs, and this may interfere with basic exercise such as walking. Therefore, center-of-gravity movement training for learning to shift the center of gravity smoothly is widely adopted, for example, in the field of rehabilitation.
- in the system described in Document 3, although the center-of-gravity position is fed back to the user and training to reduce fluctuation of the center-of-gravity position from the correct posture can be performed, it is difficult to train the user to learn the smooth center-of-gravity movement necessary for walking and the like.
- the system described in Document 3 can also display the movement locus of the center-of-gravity position, but it is difficult to judge from such a locus whether the center of gravity is being moved smoothly, so the system is not sufficient for training aimed at acquiring smooth center-of-gravity movement.
- the center-of-gravity movement training system can perform training for allowing the user to learn smooth center-of-gravity movement.
- the exercise support system 1D of this embodiment is used as a center-of-gravity movement training system.
- the center-of-gravity movement training system is used for rehabilitation for smoothly moving the center of gravity of a patient whose function of center-of-gravity movement has declined due to illness or injury.
- the description of the following embodiment is not intended to limit the use of the center of gravity movement training system.
- for example, healthy persons may also use the center-of-gravity movement training system for ordinary exercise, or for training to acquire the sense of center-of-gravity movement required in various sports.
- the exercise support system (center of gravity movement training system) 1D of the present embodiment includes a display device 3 that is arranged in front of a user (patient) 2 and displays an image on a display surface 30, a measuring device 8 that measures the load distribution of the user 2 in a horizontal plane, and a control device 5D that controls the operation of the display device 3 and the like. Both the display device 3 and the measuring device 8 are connected to the control device 5D.
- the exercise support system 1D of the present embodiment includes a half mirror 6 that is a mirror image display means.
- the measuring device 8 is located on the floor in front of the half mirror 6 and at the feet of the user 2.
- the measuring device 8 includes a boarding base 80 on which the user 2 stands, and load sensors (not shown) that measure the load applied by each of the left and right legs of the user 2 on the boarding base 80.
- at least one load sensor is provided for each of the right-leg side and the left-leg side of the boarding base 80. That is, the measuring device 8 has a working surface that receives the load from the user and is configured to measure the load distribution on the working surface.
- the action surface is the upper surface of the boarding base 80.
- the measuring device 8 measures the load distribution in the horizontal plane of the user 2 standing on the boarding base 80 by measuring the load with each load sensor. In other words, the measuring device 8 measures the load applied to the left region and the load applied to the right region with respect to the center line of the boarding base 80 in the left-right direction, and thereby measures the distribution of the load applied to each of the left leg and the right leg in real time.
- the measuring device 8 measures the load distribution of the user 2 in the horizontal plane in real time, and outputs the measurement result to the control device 5D.
- the measurement result of the measurement device 8 output to the control device 5D may be a value representing the load distribution of the user 2 in the horizontal plane.
- in this embodiment, the measuring device 8 outputs the load applied to each of the left leg and the right leg of the user 2 to the control device 5D.
- the measuring device 8 may instead be configured to measure, with a load sensor, the load applied by only one of the left leg and the right leg of the user 2. In other words, if the weight of the user 2 is given in advance as a known value, the measuring device 8 can obtain the load distribution of the user 2 from the ratio of the measured load to the weight, for example by measuring only the load on the left leg.
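The single-sensor variant just described reduces to simple arithmetic, as in the following sketch; the function name and units are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions): with the user's weight known in
# advance, measuring only the left-leg load recovers the left/right
# distribution.

def load_distribution(left_load_kgf: float, weight_kgf: float):
    """Return (left_ratio, right_ratio) as fractions of total weight."""
    left = left_load_kgf / weight_kgf
    return left, 1.0 - left

print(load_distribution(left_load_kgf=24.0, weight_kgf=60.0))  # (0.4, 0.6)
```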
- the exercise support system 1D of the present embodiment has, in the control device 5D, a calculation unit 251 that calculates a balance value representing the ratio of the left and right loads of the user 2 based on the measurement result of the measuring device 8. The control device 5D further has an index generation unit 252 that generates an index video indicating the change in the balance value accompanying the center-of-gravity movement of the user 2, a model generation unit 253 that generates a model video indicating a periodic change in the balance value serving as an example of center-of-gravity movement, and a storage unit 254 in which various setting values and the like are stored. The index generation unit 252 and the model generation unit 253 cause the display device 3 to display the generated index video and model video, respectively.
- the control device 5D includes a first extraction unit (feature amount extraction unit) 54, a second extraction unit (reference amount extraction unit) 55, an evaluation unit 56, a presentation unit 57D, a storage unit 51, a display control unit 52D, a calculation unit 251, a storage unit (second storage unit) 254, a balance value display unit 2521, a target value display unit 2531, and a center-of-gravity movement evaluation unit 255.
- the first extraction unit 54, the second extraction unit 55, and the evaluation unit (posture evaluation unit) 56 do not necessarily need to be provided.
- the calculation unit 251 is configured to calculate a balance value representing the ratio of the load at a predetermined location in the working surface, based on the load distribution measured by the measuring device 8. For example, based on the measurement result of the measuring device 8, the calculation unit 251 calculates in real time a balance value representing the ratio between the load applied to the region on the left of the center line of the boarding base 80 in the left-right direction (the load on the left leg) and the load applied to the region on the right (the load on the right leg). Specifically, from the loads applied by the left leg and the right leg of the user 2, the calculation unit 251 calculates in real time the ratio of the left and right loads to their sum (that is, the weight of the user 2) as the balance value.
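The balance-value computation just described amounts to normalizing the two leg loads by their sum; the following sketch illustrates this, with the percentage output convention as an assumption (the patent does not fix the exact representation).

```python
# Minimal sketch of the balance value: the ratio of the left and right loads
# to their sum (the user's weight), here expressed in percent.

def balance_value(left_kgf: float, right_kgf: float):
    """Return the left/right load ratio in percent of total weight."""
    total = left_kgf + right_kgf  # corresponds to the user's weight
    return 100.0 * left_kgf / total, 100.0 * right_kgf / total

print(balance_value(18.0, 42.0))  # (30.0, 70.0): weight shifted to the right
```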
- the second storage unit 254 includes a balance value storage unit 2541 and a setting data storage unit 2542.
- the balance value storage unit 2541 is configured to store the balance value calculated by the calculation unit 251.
- the setting data storage unit 2542 is configured to store setting data indicating a temporal change in the target value of the balance value.
- the setting data is defined by, for example, a sine wave having a predetermined period.
- the setting data includes a period and an amplitude value as values defining the sine wave.
- the setting data includes an exercise time that defines the length of the sine wave.
- the amplitude value is represented by exercise intensity.
- the exercise intensity indicates a ratio (percentage) of an amplitude value with respect to a preset reference value. That is, if the exercise intensity is 50%, the amplitude value is half of the reference value.
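Taken together, the setting data described above define a sine-wave target for the balance value. The following sketch generates such a target series from the period, exercise intensity, and exercise time; the sampling interval, the reference amplitude value, and the function name are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions): the target value of the balance
# value as a sine wave defined by period, exercise intensity (percent of a
# reference amplitude), and exercise time.

import math

def target_values(period_s, intensity_pct, exercise_time_s,
                  reference_amplitude=50.0, dt=0.1):
    """Target deviation of the balance value from 50/50 at each sample time."""
    amplitude = reference_amplitude * intensity_pct / 100.0  # 50% -> half
    n = int(exercise_time_s / dt)
    return [amplitude * math.sin(2 * math.pi * (i * dt) / period_s)
            for i in range(n)]

# 4-second period, 50% intensity, 60-second session:
targets = target_values(period_s=4.0, intensity_pct=50.0, exercise_time_s=60.0)
print(max(targets))  # ~25.0 = half of the 50-point reference amplitude
```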
- the index generation unit 252 generates an index video based on the balance value calculated by the calculation unit 251 and causes the display device 3 to display the index video.
- the index generation unit 252 includes a balance value display unit 2521 and a display control unit 52D.
- the balance value display unit 2521 is configured to display the balance value calculated by the calculation unit 251 on the display surface 30.
- the balance value display unit 2521 generates an index video (index image) based on the balance value, and controls the display control unit 52D so that the index video is displayed on the display surface 30.
- the model generation unit 253 generates a model video according to the setting value stored in the storage unit 254 and causes the display device 3 to display the model video.
- the model generation unit 253 includes a target value display unit 2531 and a display control unit 52D.
- the target value display unit 2531 is configured to display the target value on the display surface 30 based on the setting data stored in the setting data storage unit 2542.
- the target value display unit 2531 generates a model video (model image) based on the target value (setting value), and controls the display control unit 52D so that the model video is displayed on the display surface 30.
- in this embodiment, the index video and the model video are both moving images of bar graphs whose heights represent the load ratio, reflecting the balance value in real time.
- four bar graphs are displayed side by side in the left-right direction of the display surface 30; the two outer bar graphs form the index video 231 and the two inner bar graphs form the model video 232.
- the bar graph of the index video 231 is displayed in white and the bar graph of the model video 232 is displayed in orange so that the user 2 can easily distinguish the index video 231 and the model video 232.
- the index video 231 and the model video 232 each include a pair of bar graphs corresponding to the left leg and the right leg, and the height of each bar graph, displayed in a vertically long rectangular frame with 100% as the upper limit, indicates the ratio of the load applied to the corresponding leg to the total weight.
- the bar graph displayed at the left end of the display surface 30 corresponds to the left leg, and the bar graph displayed at the right end corresponds to the right leg; in both the index video 231 and the model video 232, the percentage of the load applied to each leg is reflected in the height of the corresponding bar graph in 1% increments.
- as the center of gravity moves, for example away from the left leg, the bar graph corresponding to the left leg in the index video 231 gradually decreases, and the bar graph corresponding to the right leg gradually increases.
- the model video 232 represents a periodic change in the balance value serving as an example of center-of-gravity movement for the user 2, by changing so that the left and right bar graphs become taller alternately at a predetermined cycle, regardless of the movement of the user 2.
- a cycle, an exercise intensity, and an exercise time are stored in advance as set values for determining the movement of the model image 232.
- the period here represents the period of the change in the balance value, the exercise intensity represents the maximum value of the load applied to the left and right legs (that is, the maximum height of the bar graph), and the exercise time represents the time for which the user 2 exercises.
- These set values are arbitrarily set from the outside using an input interface (keyboard or the like) serving as an input unit of the control device 5D, and stored in the storage unit 254 in advance.
- the exercise intensity may be set to different values for the left leg and the right leg.
- the model generation unit 253 generates a model video 232 that shows the change pattern of the balance value determined by the period and exercise intensity stored in the storage unit 254, and causes the display device 3 to display the model video 232 over the exercise time.
- the model generation unit 253 generates a model video 232 that changes in a pattern such that the temporal change in the height of the bar graph is a sine wave.
- the exercise intensity 233, the period (frequency) 234, and the exercise time 235 are each displayed above the index video 231 and the model video 232 on the display surface 30, and the remaining time 236 is displayed below the index video 231 and the model video 232.
- the remaining time here is the time obtained by subtracting the elapsed time from the exercise time.
- since the index video 231 and the model video 232 are displayed on the display device 3, the user 2 can move the center of gravity so as to change the ratio of the load applied to each leg while following the movement of the bar graphs in the model video 232 with the index video 231. That is, the user 2 can move the center of gravity so that the ratio of the load applied to the left and right legs changes in accordance with the movement of the bar graphs in the model video 232.
- the model generation unit 253 may also notify the user 2 by, for example, changing the display color of the bar graphs.
- the exercise support system 1D further includes, in the control device 5D, an evaluation unit (center-of-gravity movement evaluation unit) 255 that evaluates the shift in the timing at which the balance value changes between the model video 232 and the index video 231, and a presentation unit 57D that presents the evaluation result of the evaluation unit 255.
- the center-of-gravity movement evaluation unit 255 is configured to obtain the temporal change in the balance value from the balance values stored in the balance value storage unit 2541, and to evaluate the center-of-gravity movement of the user 2 based on that temporal change and the temporal change in the target value indicated by the setting data. That is, the evaluation unit (center-of-gravity movement evaluation unit) 255 compares the index video 231, which shows the change in the balance value accompanying the center-of-gravity movement of the user 2, with the model video 232, which shows the periodic change in the balance value serving as an example of center-of-gravity movement, and evaluates the timing deviation of the change in the balance value between the two. This evaluation result represents how well the actual center-of-gravity movement of the user 2, indicated by the index video 231, follows the exemplary center-of-gravity movement indicated by the model video 232; the smaller the deviation, the higher the followability.
- in this embodiment, for each predetermined sampling period (for example, 100 msec), the evaluation unit 255 calculates the difference between the value indicated by the model video 232 and the value indicated by the index video 231 for the ratio of the load applied to one leg (for example, the right leg). A score corresponding to its magnitude is assigned to the difference in advance, and the evaluation unit 255 adds the score corresponding to the difference each time the difference is calculated; the total score finally obtained is taken as the evaluation score. The scores are assigned so that the smaller the difference (that is, the smaller the deviation), the higher the score.
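The following sketch illustrates this per-sample scoring; the concrete score table is an assumption, since the patent only fixes the monotone relation (smaller difference, higher score) and the 100 msec sampling period.

```python
# Minimal sketch (illustrative assumptions): every sampling period, compare
# the right-leg load ratio in the model video with that in the index video
# and accumulate a score that is larger for smaller differences.

def step_score(diff_pct: float) -> int:
    """Assumed score table: smaller difference -> higher score."""
    if diff_pct < 2.0:
        return 10
    if diff_pct < 5.0:
        return 5
    if diff_pct < 10.0:
        return 2
    return 0

def evaluation_score(model_ratios, index_ratios):
    """Sum step scores over samples taken every 100 msec."""
    return sum(step_score(abs(m - i))
               for m, i in zip(model_ratios, index_ratios))

print(evaluation_score([50, 60, 70, 60], [51, 58, 65, 61]))  # 27
```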
- the evaluation unit 255 includes a difference in balance value at the same timing between the model video 232 and the index video 231 as an evaluation target, and evaluates a deviation between the model video 232 and the index video 231. That is, the center-of-gravity movement evaluation unit 255 is configured to evaluate the center-of-gravity movement using the difference between the balance value and the target value at a predetermined time.
- for example, even in a case where the model video 232 and the index video 231 match in the timing of the change in the balance value but deviate in the magnitude of the balance value, the evaluation unit 255 can evaluate that deviation as well.
- the evaluation unit 255 therefore has the advantage that it can strictly evaluate the deviation between the exemplary center-of-gravity movement indicated by the model video 232 and the actual center-of-gravity movement of the user 2 indicated by the index video 231.
- the evaluation method performed by the evaluation unit 255 is not limited to the above-described method, and may be any method that can evaluate a shift in timing at which the balance value changes between the model image 232 and the index image 231.
- the evaluation unit 255 may obtain an evaluation score by obtaining a cumulative total of differences calculated for each predetermined sampling period and converting the obtained cumulative value into a score.
- the evaluation unit 255 can also evaluate the shift in the timing at which the balance value changes between the model video 232 and the index video 231 by, for example, extracting the local maximum points (or local minimum points) of the ratio of the load applied to one leg (for example, the right leg) to the total weight from each of the model video 232 and the index video 231, and quantifying the deviation of those maximum points (or minimum points) along the time axis.
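A sketch of this peak-based variant follows; the simple three-point peak detector, the in-order pairing of peaks, and the averaging are assumptions chosen for illustration.

```python
# Minimal sketch (illustrative assumptions): extract local maxima of the
# right-leg load ratio from the model and index time series and quantify
# the mean shift of the peaks along the time axis.

def local_maxima(samples):
    """Indices of simple local maxima in a 1-D series."""
    return [i for i in range(1, len(samples) - 1)
            if samples[i - 1] < samples[i] >= samples[i + 1]]

def mean_timing_shift(model, index, dt=0.1):
    """Mean |time shift| (s) between paired model and index peaks."""
    pairs = zip(local_maxima(model), local_maxima(index))  # same-order pairing
    shifts = [abs(a - b) * dt for a, b in pairs]
    return sum(shifts) / len(shifts) if shifts else 0.0

model = [50, 60, 70, 60, 50, 60, 70, 60, 50]
index = [50, 55, 65, 72, 60, 50, 65, 71, 60]
print(mean_timing_shift(model, index))  # index peaks lag by ~0.1 s
```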
- the presenting unit 57D is configured to present the result of evaluation by the gravity center movement evaluating unit 255.
- the presentation unit 57D presents to the user 2 the evaluation result of the timing shift between the model video 232 and the index video 231 performed by the evaluation unit 255.
- the presentation unit 57D presents the evaluation result of the evaluation unit 255 to the user 2 by voice or light.
- alternatively, the evaluation result of the evaluation unit 255 may be displayed on the display device 3, with the display device 3 also serving as the presentation unit 57D.
- when the display device 3 is also used as the presentation unit 57D, it is conceivable to display a message indicating the evaluation result on the display surface 30 after the training time ends.
- the content presented by the presentation unit 57D may be an evaluation score that quantitatively represents the degree of deviation, or a result of ranking the evaluation score into a plurality of grades. Furthermore, when the evaluation unit 255 determines a tendency in the deviation, such as a large deviation when the center of gravity is moved to the right leg side, the presentation unit 57D may present advice according to the determination result.
- the control device 5D starts measuring the exercise time with a timer (not shown) and causes the display device 3 to display the index video 231 and the model video 232.
- the user 2 can perform correct center-of-gravity movement by moving the center of gravity so that the movement of the bar graphs of the index video 231 matches the model video 232 while watching the index video 231 and the model video 232.
- the control device 5D then ends the display of the index video 231 and the model video 232, has the evaluation unit 255 evaluate the timing shift between the model video 232 and the index video 231, and causes the presentation unit 57D to present the evaluation result to the user 2.
- the control device 5D then starts counting the exercise time again and causes the display device 3 to display the index video 231 and the model video 232.
- as described above, the exercise support system (center of gravity movement training system) 1D of the present embodiment includes the display device 3 that displays an image on the display surface 30, the control device 5D that causes the display device 3 to display images, and the measuring device 8 that is disposed at the feet of the user 2 facing the display surface 30 and measures the load distribution of the user 2 in a horizontal plane.
- the control device 5D includes: the calculation unit 251, which calculates a balance value representing the ratio of the left-right or front-rear loads of the user 2 based on the measurement result of the measuring device 8; the index generation unit 252, which generates an index video indicating the change in the balance value accompanying the center-of-gravity movement of the user 2 and causes the display device 3 to display it; the model generation unit 253, which generates a model video indicating a periodic change in the balance value serving as an example of center-of-gravity movement and causes the display device 3 to display it; the evaluation unit 255, which evaluates the shift in the timing at which the balance value changes between the model video and the index video; and the presentation unit 57D, which presents the evaluation result of the evaluation unit 255.
- in other words, the exercise support system 1D includes: the measuring device 8, which has a working surface (the upper surface of the boarding base 80) that receives the load from the user 2 and measures the load distribution on the working surface; the calculation unit 251, which calculates a balance value representing the load ratio at a predetermined location in the working surface based on the load distribution measured by the measuring device 8; the balance value storage unit 2541, which stores the balance value calculated by the calculation unit 251; the setting data storage unit 2542, which stores setting data indicating the temporal change in the target value of the balance value; and the center-of-gravity movement evaluation unit 255, which obtains the temporal change in the balance value and evaluates the center-of-gravity movement of the user 2 based on the temporal change in the actual balance value and the temporal change in the target value indicated by the setting data. The presentation unit 57D is configured to present the result of the evaluation by the center-of-gravity movement evaluation unit 255.
- with the above configuration, the user 2 can learn the correct center-of-gravity movement exemplified by the model video 232 by moving the center of gravity so that the index video 231 matches the model video 232.
- training the user 2 to move the center of gravity in accordance with a periodically changing center-of-gravity position is directly linked to learning smooth center-of-gravity movement. Therefore, the user 2 can perform training for learning the smooth center-of-gravity movement necessary for walking and the like simply by moving the body as if enjoying a game. That is, according to the exercise support system 1D of the present embodiment, the user 2 can be trained to learn smooth center-of-gravity movement. Such training also develops the user 2's instantaneous responsiveness, which helps prevent falls during walking.
- the evaluation unit 255 includes the difference in the balance value at the same timing between the model image and the index image as an evaluation target.
- the evaluation unit (center of gravity movement evaluation unit) 255 is configured to evaluate the center of gravity movement using the difference between the balance value and the target value at a predetermined time point.
- with this configuration, the evaluation unit 255 evaluates, from the timing deviation between the model video 232 and the index video 231, how well the actual center-of-gravity movement of the user 2 follows the exemplary center-of-gravity movement indicated by the model video 232, and the evaluation result is fed back to the user 2 from the presentation unit 57D. Therefore, the user 2 can fully understand the necessity and effects of the center-of-gravity movement training based on the evaluation of whether he or she can move the center of gravity smoothly.
- the exercise support system 1D further includes a half mirror 6 that is disposed on the user 2 side with respect to the display surface 30 and transmits the image displayed on the display surface 30 and reflects the mirror image of the user 2. That is, the exercise support system 1D of the present embodiment includes the half mirror 6 disposed in front of the display surface 30 as a mirror image display unit.
- with this configuration, the user 2 can exercise while looking at his or her own mirror image reflected in the half mirror 6, and can therefore visually learn how the center of gravity moves in each posture. The user 2 can thus learn the body movements necessary for moving the center of gravity, such as how to tilt the body when applying a load to the right leg during training.
- furthermore, the user 2 can not only visually understand the center-of-gravity movement, but also perform training to move the center of gravity while confirming his or her own posture. For example, the user 2 can perform training to move the center of gravity while checking the inclination of the body, or while keeping the line connecting both shoulders horizontal.
- the control device 5D includes the storage unit (setting data storage unit) 2542, which stores the setting values for determining the movement of the model video, and a setting update unit 256, which changes the setting values stored in the storage unit 2542 according to the evaluation result of the evaluation unit 255. That is, the control device 5D includes the setting update unit 256, which changes the temporal change in the target value indicated by the setting data stored in the setting data storage unit 2542 according to the evaluation result of the center-of-gravity movement evaluation unit 255.
- the model generation unit 253 changes the content of the model video 232 according to the evaluation result of the evaluation unit 255.
- when a high evaluation is obtained by the evaluation unit 255 (when the deviation is small), the setting update unit 256 shortens the period of the setting values or increases the exercise intensity in order to raise the difficulty of the center-of-gravity movement indicated by the model video 232. For example, if the deviation is equal to or less than a predetermined first threshold value, the setting update unit 256 decreases the period by a predetermined value or increases the exercise intensity by a predetermined value. Conversely, when a low evaluation is obtained by the evaluation unit 255 (when the deviation is large), the setting update unit 256 lengthens the period of the setting values or reduces the exercise intensity in order to lower the difficulty of the center-of-gravity movement indicated by the model video 232. For example, if the deviation is equal to or greater than a predetermined second threshold value larger than the first threshold value, the setting update unit 256 increases the period by a predetermined value or decreases the exercise intensity by a predetermined value.
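The update rule just described could be sketched as follows; the threshold values, step sizes, and the choice to adjust both period and intensity at once are illustrative assumptions, as the patent only fixes the direction of each adjustment.

```python
# Minimal sketch (illustrative assumptions): adapt the model video's
# difficulty to the user's performance. Small deviation -> shorter period /
# higher intensity; large deviation -> longer period / lower intensity.

def update_settings(deviation, period_s, intensity_pct,
                    first_threshold=2.0, second_threshold=8.0,
                    period_step=0.5, intensity_step=10.0):
    """Return (period, intensity) adjusted to the evaluation result."""
    if deviation <= first_threshold:      # high evaluation: raise difficulty
        period_s = max(1.0, period_s - period_step)
        intensity_pct = min(100.0, intensity_pct + intensity_step)
    elif deviation >= second_threshold:   # low evaluation: lower difficulty
        period_s += period_step
        intensity_pct = max(10.0, intensity_pct - intensity_step)
    return period_s, intensity_pct

print(update_settings(1.5, period_s=4.0, intensity_pct=50.0))  # (3.5, 60.0)
```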
- accordingly, the exercise support system (center-of-gravity movement training system) 1D can provide the user 2 with training at a difficulty that matches his or her ability to move the center of gravity, enabling appropriate exercise without placing an excessive burden on the user 2.
- in this embodiment, bar graph images are exemplified as the index video 231 and the model video 232, but the present invention is not limited to this example.
- the exercise support system (center of gravity movement training system) 1E of the present embodiment is different from the exercise support system (center of gravity movement training system) 1D of the fifth embodiment in that the half mirror 6 is not provided. Further, in the exercise support system 1E of the present embodiment, an imaging device 7 is provided that is disposed in front of the user 2 and has a lens directed in a direction in which the user 2 is imaged from the front.
- the exercise support system 1E of the present embodiment is different from the exercise support system 1D of the fifth embodiment in a control device 5E.
- the control device 5E includes a first extraction unit (feature amount extraction unit) 54, a second extraction unit (reference amount extraction unit) 55, an evaluation unit 56, a presentation unit 57D, and a storage unit. 51, a display control unit 52D, a calculation unit 251, a storage unit (second storage unit) 254, a balance value display unit 2521, a target value display unit 2531, and a centroid movement evaluation unit 255.
- the control device 5E further includes an acquisition unit 53 and an inversion processing unit 58.
- the display control unit 52D functions as an inverted image display unit that displays the horizontally inverted image generated by the inversion processing unit 58 on the display surface 30.
- the exercise support system 1E according to the present embodiment does not present an optically formed mirror image; instead, it has the user 2 visually recognize the horizontally inverted video displayed on the display device 3 so that the user 2 can perceive the inverted video as his or her own mirror image. Therefore, according to the exercise support system 1E of the present embodiment, the same effect as when the half mirror 6 is provided can be obtained.
- the index generation unit 252 and the model generation unit 253 cause the display device 3 to display the index video 231 and the model video 232 together with the inverted video generated by the inversion processing unit 58.
- the inverted video may be displayed so as to overlap the index video 231 and the model video 232; in this case, the inverted video is preferably displayed semi-transparently (for example, at a transmittance of 50%).
- the mirror image display means of the exercise support system 1E of the present embodiment is composed of the imaging device 7, which captures the user 2 and generates an image of the user 2, the inversion processing unit 58, which horizontally inverts the image of the user 2 generated by the imaging device 7 to generate a horizontally inverted image, and the display control unit 52D, which displays the horizontally inverted image generated by the inversion processing unit 58 on the display surface 30.
- since the half mirror 6 is omitted, the configuration can be simplified compared with the exercise support system 1D of the fifth embodiment.
- in addition, an existing display can be used as the display device 3 without newly installing a dedicated display, so system introduction costs can be reduced.
- since the center-of-gravity movement of the user 2 is presented to the user 2 by the index video 231, it is not essential that the user 2 be able to exercise while looking at his or her own image (mirror image), and the function of displaying the inverted video may be omitted.
- although the calculation unit 251 calculates the balance value as the ratio of the left and right loads of the user 2 in the above description, the balance value may instead be the ratio of the loads in the front-rear direction of the user 2.
- in this case, the exercise support system (center-of-gravity movement training system) 1E can be used for training of center-of-gravity movement in which the center-of-gravity position is moved alternately in the front-rear direction.
Abstract
Description
A fifth aspect of the exercise support system of the present invention is, in any one of the first to third aspects, further comprising: an imaging device that captures the user performing the predetermined exercise and generates a recorded image, which is an image of the user performing the predetermined exercise; and a reference amount extraction unit. The comparative image display unit is configured to display the recorded image generated by the imaging device on the display surface as the comparative image. The reference amount extraction unit is configured to detect the positions of the predetermined sample points on the user's body from the recorded image, and to obtain, as the reference amount, a feature amount representing the posture of the user based on the positions of the sample points.
The exercise support system 1 of this embodiment is used for rehabilitation aimed at restoring the function of the limbs by having a patient whose limbs remain impaired due to illness or injury perform a predetermined exercise. However, the description of the following embodiments is not intended to limit the use of the exercise support system; for example, a healthy person may use the exercise support system for exercise such as yoga or dance. In the following, an example is shown in which the user uses the exercise support system in a standing posture, but the user may instead use the exercise support system while seated on a chair or the like.
The exercise support system 1A of this embodiment differs from the exercise support system 1 of Embodiment 1 in that the half mirror 6 is not provided. In addition, in the exercise support system 1A of this embodiment, as shown in FIG. 10, an imaging device 7 is provided that is disposed in front of the user 2 with its lens oriented so as to image the user 2 from the front.
The exercise support system 1B of this embodiment has an aspect as a range-of-motion training system. The range-of-motion training system is used for range-of-motion training for returning the range of motion of a specific part of the body, such as a limb, to its normal range or maintaining it within the normal range.
The exercise support system (range-of-motion training system) 1C of this embodiment differs from the exercise support system (range-of-motion training system) 1B of Embodiment 3 in that the half mirror 6 is not provided, as shown in FIG. 16. In addition, in the exercise support system 1C of this embodiment, an imaging device 7 is provided that is disposed in front of the user 2 with its lens oriented so as to image the user 2 from the front.
The exercise support system 1D of this embodiment has an aspect as a center-of-gravity movement training system. The center-of-gravity movement training system is used for training a user's center-of-gravity movement.
The exercise support system (center-of-gravity movement training system) 1E of this embodiment differs from the exercise support system (center-of-gravity movement training system) 1D of Embodiment 5 in that the half mirror 6 is not provided. In addition, in the exercise support system 1E of this embodiment, an imaging device 7 is provided that is disposed in front of the user 2 with its lens oriented so as to image the user 2 from the front.
Claims (29)
- 1. An exercise support system comprising: a display device having a display surface that displays an image to a user; a comparative image storage unit that stores a comparative image, which is an image of an exerciser performing a predetermined exercise; a comparative image display unit that displays the comparative image stored in the comparative image storage unit on the display surface; mirror image display means that displays a mirror image of the user superimposed on the comparative image; a feature amount extraction unit that detects positions of predetermined sample points on the user's body and obtains, based on the positions of the sample points, a feature amount representing the posture of the user; a posture evaluation unit that compares the feature amount obtained by the feature amount extraction unit with a reference amount representing the posture of the exerciser and evaluates the deviation of the posture of the user from the posture of the exerciser; and a presentation unit that presents a result of the evaluation by the posture evaluation unit.
- 2. The exercise support system according to claim 1, wherein the mirror image display means is a half mirror disposed on the front surface of the display device.
- 3. The exercise support system according to claim 1, wherein the mirror image display means comprises: an imaging device that captures the user and generates an image of the user; an inversion processing unit that horizontally inverts the image of the user generated by the imaging device to generate a horizontally inverted image; and an inverted image display unit that displays the horizontally inverted image generated by the inversion processing unit on the display surface.
- 4. The exercise support system according to any one of claims 1 to 3, wherein the predetermined exercise is an exercise serving as a model for the exercise performed by the user.
- 5. The exercise support system according to any one of claims 1 to 3, further comprising: an imaging device that captures the user performing the predetermined exercise and generates a recorded image, which is an image of the user performing the predetermined exercise; and a reference amount extraction unit, wherein the comparative image display unit is configured to display the recorded image generated by the imaging device on the display surface as the comparative image, and the reference amount extraction unit is configured to detect the positions of the predetermined sample points on the user's body from the recorded image and to obtain, as the reference amount, a feature amount representing the posture of the user based on the positions of the sample points.
- 6. The exercise support system according to any one of claims 1 to 5, wherein the feature amount extraction unit is configured to detect positions of a plurality of the sample points on the user's body, create a body model representing the user's body based on the positions of the plurality of sample points, and obtain the feature amount based on the body model.
- 7. The exercise support system according to claim 6, wherein the feature amount is an angle between a straight line connecting two sample points selected from the plurality of sample points and a predetermined reference line.
- 8. The exercise support system according to any one of claims 1 to 5, wherein the feature amount is an inclination of the user's body.
- 9. The exercise support system according to claim 8, wherein the inclination of the user's body is an inclination of the user's shoulders, and the feature amount extraction unit is configured to detect positions of a sample point of the user's right upper limb and a sample point of the user's left upper limb, and to calculate, as the inclination of the user's shoulders, an angle between a straight line connecting the sample point of the right upper limb and the sample point of the left upper limb and a horizontal line.
- 10. The exercise support system according to claim 8, wherein the inclination of the user's body is an inclination of the user's trunk, and the feature amount extraction unit is configured to detect positions of a sample point of the user's head and a sample point of the user's waist, and to calculate, as the inclination of the user's trunk, an angle between a straight line connecting the sample point of the head and the sample point of the waist and a vertical line.
- 11. The exercise support system according to any one of claims 1 to 5, wherein the feature amount is a range of motion of a specific part of the user's body.
- 12. The exercise support system according to claim 11, wherein the specific part is an upper limb of the user, and the feature amount extraction unit is configured to detect positions of a sample point of the upper limb of the user and a sample point of the shoulder corresponding to the upper limb, and to calculate, as the range of motion of the upper limb, an angle between a straight line connecting the sample point of the upper limb and the sample point of the shoulder and a vertical line.
- 13. The exercise support system according to claim 6, wherein the number of the sample points is at least three, and the feature amount is an area of a region defined by the plurality of sample points.
- 14. The exercise support system according to any one of claims 1 to 13, wherein the comparative image display unit is configured to adjust the position and size of the comparative image on the display surface so that the comparative image overlaps the mirror image of the user.
- 15. The exercise support system according to claim 1, wherein the posture evaluation unit is configured to obtain a numerical value indicating the difference between the feature amount and the reference amount, and the presentation unit is configured to present the numerical value obtained by the posture evaluation unit.
- 16. The exercise support system according to claim 1, further comprising: a position detection unit that detects a position of a specific part of the user's body; an index setting unit that determines a position of an index on the display surface based on the position detected by the position detection unit; a determination unit that determines whether the index is located at a predetermined position on the display surface; an event image display unit that displays a predetermined event image at the predetermined position when the determination unit determines that the index is located at the predetermined position; an evaluation data creation unit that creates evaluation data indicating a movable range of the specific part based on the position detected by the position detection unit; and a range-of-motion evaluation unit that evaluates the range of motion of the specific part based on a comparison between the evaluation data created by the evaluation data creation unit and reference data, wherein the presentation unit is configured to present a result of the evaluation by the range-of-motion evaluation unit.
- 17. The exercise support system according to claim 16, wherein the range-of-motion evaluation unit is configured to adopt, as the reference data, the evaluation data used in the previous evaluation of the range of motion of the specific part.
- 18. The exercise support system according to claim 16, wherein the reference data is data indicating a standard range of motion of the specific part of a healthy person.
- 19. The exercise support system according to any one of claims 16 to 18, wherein the evaluation data includes data indicating an area of a region through which the specific part has passed in a plane parallel to the display surface.
- 20. The exercise support system according to any one of claims 16 to 19, wherein the evaluation data includes data indicating a movable range of the specific part in a predetermined direction.
- 21. The exercise support system according to any one of claims 16 to 20, wherein the evaluation data includes data indicating a time required for the user to perform a predetermined action with the specific part.
- 22. The exercise support system according to any one of claims 16 to 21, wherein the position detection unit is configured to detect the position of the specific part from an output of a three-dimensional sensor, and the evaluation data includes data indicating a volume of a region through which the specific part has passed.
- 23. The exercise support system according to any one of claims 16 to 22, wherein the evaluation data includes data indicating a locus of the specific part.
- 24. The exercise support system according to any one of claims 16 to 23, wherein the mirror image display means is a half mirror disposed on the front surface of the display device, and the index setting unit is configured to determine the position of the index so that the position of the index corresponds to the position on the display surface that overlaps the specific part reflected in the half mirror.
- 25. The exercise support system according to any one of claims 16 to 23, wherein the mirror image display means comprises: an imaging device that captures the user and generates an image of the user; an inversion processing unit that horizontally inverts the image of the user generated by the imaging device to generate a horizontally inverted image; and an inverted image display unit that displays the horizontally inverted image generated by the inversion processing unit on the display surface, and the index setting unit is configured to determine the position of the index so that the position of the index corresponds to the position of the specific part in the horizontally inverted image.
- 26. The exercise support system according to claim 1, further comprising: a measuring device that has a working surface that receives a load from the user and measures a distribution of the load on the working surface; a calculation unit that calculates a balance value representing a ratio of the load at a predetermined location in the working surface based on the distribution of the load measured by the measuring device; a balance value storage unit that stores the balance value calculated by the calculation unit; a balance value display unit that displays the balance value calculated by the calculation unit on the display surface; a setting data storage unit that stores setting data indicating a temporal change in a target value of the balance value; a target value display unit that displays the target value on the display surface based on the setting data stored in the setting data storage unit; and a center-of-gravity movement evaluation unit that obtains a temporal change in the balance value from the balance values stored in the balance value storage unit and evaluates the center-of-gravity movement of the user based on the temporal change in the balance value and the temporal change in the target value indicated by the setting data, wherein the presentation unit is configured to present a result of the evaluation by the center-of-gravity movement evaluation unit.
- 27. The exercise support system according to claim 26, wherein the center-of-gravity movement evaluation unit is configured to evaluate the center-of-gravity movement using a difference between the balance value and the target value at a predetermined time point.
- 28. The exercise support system according to claim 26 or 27, further comprising a setting update unit that changes the temporal change in the target value indicated by the setting data stored in the setting data storage unit according to the result of the evaluation by the center-of-gravity movement evaluation unit.
- 29. The exercise support system according to any one of claims 26 to 28, wherein the mirror image display means is a half mirror disposed in front of the display surface.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/822,828 US20130171601A1 (en) | 2010-09-22 | 2011-09-22 | Exercise assisting system |
JP2012535074A JP5624625B2 (ja) | 2010-09-22 | 2011-09-22 | 運動支援システム |
CN201180045458.3A CN103118647B (zh) | 2010-09-22 | 2011-09-22 | 运动支援系统 |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010212732A JP5552010B2 (ja) | 2010-09-22 | 2010-09-22 | 可動域訓練システム |
JP2010212733 | 2010-09-22 | ||
JP2010-212733 | 2010-09-22 | ||
JP2010-212732 | 2010-09-22 | ||
JP2010216221A JP5597079B2 (ja) | 2010-09-27 | 2010-09-27 | 重心移動訓練システム |
JP2010-216221 | 2010-09-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012039467A1 true WO2012039467A1 (ja) | 2012-03-29 |
Family
ID=45873950
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/071667 WO2012039467A1 (ja) | 2010-09-22 | 2011-09-22 | 運動支援システム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20130171601A1 (ja) |
CN (1) | CN103118647B (ja) |
WO (1) | WO2012039467A1 (ja) |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014057800A (ja) * | 2012-09-19 | 2014-04-03 | Nagasakiken Koritsu Daigaku Hojin | 動作評価支援装置および動作評価支援方法 |
JP2014068714A (ja) * | 2012-09-28 | 2014-04-21 | Kitasato Institute | 関節角度測定システム |
JP2014128409A (ja) * | 2012-12-28 | 2014-07-10 | Brother Ind Ltd | 情報処理装置、情報処理方法及びプログラム |
WO2014112645A1 (ja) * | 2013-01-21 | 2014-07-24 | 株式会社東芝 | 動作情報表示装置及びプログラム |
JP2014142695A (ja) * | 2013-01-22 | 2014-08-07 | Ricoh Co Ltd | 情報処理装置、システム、画像投影装置、情報処理方法およびプログラム |
JP2014188146A (ja) * | 2013-03-27 | 2014-10-06 | Nippon Telegraph & Telephone East Corp | 運動姿勢評価装置、運動姿勢評価方法及びコンピュータプログラム |
WO2014162788A1 (ja) * | 2013-04-02 | 2014-10-09 | Necソリューションイノベータ株式会社 | 顔表情採点装置、ダンス採点装置、カラオケ装置、およびゲーム装置 |
WO2014162787A1 (ja) * | 2013-04-02 | 2014-10-09 | Necソリューションイノベータ株式会社 | 身体動作採点装置、ダンス採点装置、カラオケ装置及びゲーム装置 |
JP2014240800A (ja) * | 2013-06-12 | 2014-12-25 | 株式会社ブリヂストン | 検査補助装置 |
US20150003687A1 (en) * | 2013-07-01 | 2015-01-01 | Kabushiki Kaisha Toshiba | Motion information processing apparatus |
JP2015097639A (ja) * | 2013-11-19 | 2015-05-28 | 日本コントロールシステム株式会社 | カラオケ装置、ダンス採点方法、およびプログラム |
JP2015198834A (ja) * | 2014-04-09 | 2015-11-12 | 黄富表Huang Fubiao | MicrosoftKinect(登録商標)による脳血管障害者の回復段階の判定 |
KR20160114996A (ko) * | 2015-03-25 | 2016-10-06 | 한국전자통신연구원 | 사용자 맞춤형 운동 서비스 제공 방법 및 장치 |
JP2016174637A (ja) * | 2015-03-18 | 2016-10-06 | 株式会社タイトー | ダンス装置 |
JP2017004464A (ja) * | 2015-06-16 | 2017-01-05 | 株式会社東芝 | 画像処理装置、画像処理システム、画像処理方法及びプログラム |
WO2017017737A1 (ja) * | 2015-07-24 | 2017-02-02 | 富士通株式会社 | 表示方法、モニタ結果出力方法、情報処理装置および表示プログラム |
JP2017060572A (ja) * | 2015-09-24 | 2017-03-30 | パナソニックIpマネジメント株式会社 | 機能訓練装置 |
JP2017074350A (ja) * | 2014-12-25 | 2017-04-20 | ダンロップスポーツ株式会社 | スイング解析装置、方法及びプログラム |
KR101728775B1 (ko) * | 2015-01-28 | 2017-04-21 | 경북대학교 산학협력단 | 하프미러에 기반한 초점 및 시점 일치화 디스플레이 장치 및 방법, 그 방법을 수행하기 위한 기록 매체 |
JP2017077403A (ja) * | 2015-10-21 | 2017-04-27 | 国立大学法人 筑波大学 | 評価情報提供システムおよび評価情報提供方法 |
JP2017221686A (ja) * | 2012-06-04 | 2017-12-21 | ナイキ イノベイト シーブイ | フィットネスサブスコアとアスレチックサブスコアを含む組み合わせスコア |
WO2018168756A1 (ja) * | 2017-03-15 | 2018-09-20 | 本田技研工業株式会社 | 歩行支援システム、歩行支援方法、及びプログラム |
KR20180107015A (ko) * | 2017-03-21 | 2018-10-01 | 성균관대학교산학협력단 | 보행 보조기의 위험 상황 판단 방법 및 장치 |
JP2018171523A (ja) * | 2014-07-03 | 2018-11-08 | 帝人ファーマ株式会社 | リハビリテーション支援装置及びリハビリテーション支援装置の制御プログラム |
JP2019008393A (ja) * | 2017-06-21 | 2019-01-17 | FunLife株式会社 | センサ、ハーフミラー及びディスプレイ装置を備えた装置 |
JP2019024550A (ja) * | 2017-07-25 | 2019-02-21 | 株式会社クオンタム | 検出装置、検出システム、処理装置、検出方法、及び検出プログラム |
KR101959079B1 (ko) * | 2018-10-08 | 2019-03-18 | 주식회사 마이베네핏 | 신체 측정 및 평가 방법 |
JP2019118783A (ja) * | 2018-01-10 | 2019-07-22 | ユインケア コーポレーション | 遠隔リハビリ分析装置およびその方法 |
US10366617B2 (en) | 2015-09-08 | 2019-07-30 | Toyota Jidosha Kabushiki Kaisha | Walking training apparatus and walking training method |
WO2019172298A1 (ja) * | 2018-03-06 | 2019-09-12 | 株式会社Mtg | 運動ブースの設置構造および方法 |
JP2019150545A (ja) * | 2018-03-06 | 2019-09-12 | 株式会社 Mtg | 運動ブースの設置構造および方法 |
US10420982B2 (en) | 2010-12-13 | 2019-09-24 | Nike, Inc. | Fitness training system with energy expenditure calculation that uses a form factor |
JP2019205820A (ja) * | 2019-03-15 | 2019-12-05 | 株式会社コナミスポーツライフ | 運動指示装置、プログラム、トレーニングシステム、及び運動器具 |
US10583328B2 (en) | 2010-11-05 | 2020-03-10 | Nike, Inc. | Method and system for automated personal training |
JP2020098291A (ja) * | 2018-12-19 | 2020-06-25 | カシオ計算機株式会社 | 表示装置、表示方法およびプログラム |
KR102127355B1 (ko) * | 2018-12-28 | 2020-06-29 | 전북대학교산학협력단 | 멀티형 슬링 운동 장치 및 시스템 |
US10825561B2 (en) | 2011-11-07 | 2020-11-03 | Nike, Inc. | User interface for remote joint workout session |
JP6811349B1 (ja) * | 2020-03-31 | 2021-01-13 | 株式会社三菱ケミカルホールディングス | 情報処理装置、方法、プログラム |
WO2021149918A1 (ko) * | 2020-01-23 | 2021-07-29 | 고려대학교 산학협력단 | 골 연령 추정 방법 및 장치 |
JP2022023867A (ja) * | 2018-05-29 | 2022-02-08 | キュリアサー プロダクツ インコーポレイテッド | 対話型トレーニング及びデモンストレーション用の反射ビデオディスプレイ機器及びその使用方法 |
WO2022054366A1 (ja) * | 2020-09-09 | 2022-03-17 | 高木 りか | 姿勢評価プログラム、姿勢評価装置、姿勢評価方法、及び姿勢評価システム |
CN114267220A (zh) * | 2021-12-27 | 2022-04-01 | 林华 | 一种外科手术教学模拟方法及系统 |
WO2022075116A1 (ja) * | 2020-10-06 | 2022-04-14 | 国立研究開発法人産業技術総合研究所 | 情報表示装置及び情報表示方法 |
JP7150387B1 (ja) | 2022-01-13 | 2022-10-11 | 三菱ケミカルグループ株式会社 | プログラム、方法、および電子機器 |
JP7201850B1 (ja) | 2022-01-24 | 2023-01-10 | 三菱ケミカルグループ株式会社 | 情報処理装置、方法、およびプログラム |
JP2023008347A (ja) * | 2021-07-05 | 2023-01-19 | オプティメース株式会社 | 装着デバイス及び身体動作支援システム |
WO2023084965A1 (ja) * | 2021-11-10 | 2023-05-19 | 株式会社Nttドコモ | 映像作成装置、映像作成方法、およびプログラム |
WO2023188280A1 (ja) * | 2022-03-31 | 2023-10-05 | 日本電気株式会社 | 動作情報生成装置、動作情報生成システム、動作情報生成方法、及び記録媒体 |
US11842027B2 (en) | 2021-03-17 | 2023-12-12 | Samsung Electronics Co., Ltd. | Electronic device and controlling method of electronic device |
KR102619887B1 (ko) * | 2023-03-17 | 2024-01-02 | 한연오 | 신체활동 챌린지 서비스를 지원하는 시스템 및 방법 |
JP7620372B1 (ja) | 2023-04-18 | 2025-01-23 | 株式会社Orgo | 運動評価装置、運動評価方法、及び運動評価プログラム |
Families Citing this family (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9171384B2 (en) * | 2011-11-08 | 2015-10-27 | Qualcomm Incorporated | Hands-free augmented reality for wireless communication devices |
JP6045139B2 (ja) * | 2011-12-01 | 2016-12-14 | キヤノン株式会社 | 映像生成装置、映像生成方法及びプログラム |
WO2014042121A1 (ja) * | 2012-09-12 | 2014-03-20 | 独立行政法人産業技術総合研究所 | 動作評価装置及びそのプログラム |
US11083344B2 (en) | 2012-10-11 | 2021-08-10 | Roman Tsibulevskiy | Partition technologies |
US9829311B1 (en) | 2012-12-22 | 2017-11-28 | Bertec Corporation | Force measurement system |
US10803990B1 (en) | 2012-12-22 | 2020-10-13 | Bertec Corporation | Measurement and testing system that includes a data processing device configured to synchronize a first plurality of data values with a second plurality of data values by determining which first timestamps correspond to which second timestamps and to interpolate values missing from the second values |
US10331324B1 (en) | 2012-12-22 | 2019-06-25 | Bertec Corporation | Measurement and testing system |
US9200897B1 (en) * | 2012-12-22 | 2015-12-01 | Bertec Corporation | Measurement and testing system |
US11705244B1 (en) | 2012-12-22 | 2023-07-18 | Bertec Corporation | Force and/or motion measurement system that includes at least one camera and at least one data processing device configured to execute computer executable instructions for determining a position and/or movement |
US20190068589A1 (en) * | 2013-01-09 | 2019-02-28 | Chris Outwater | Range of Motion Tracking System |
US20140255890A1 (en) * | 2013-03-07 | 2014-09-11 | Hill-Rom Services, Inc. | Patient support apparatus with physical therapy system |
BR112015022240A2 (pt) * | 2013-03-13 | 2017-07-18 | Witt James | método e aparelho para o ensino de movimento repetitivo cinestésico |
CN105246449A (zh) * | 2013-05-28 | 2016-01-13 | 富士机械制造株式会社 | 护理机器人 |
WO2015011898A1 (ja) * | 2013-07-22 | 2015-01-29 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報処理装置および情報処理装置の制御方法 |
WO2015058388A1 (zh) * | 2013-10-24 | 2015-04-30 | 华为终端有限公司 | 一种显示图像的方法和装置 |
EP2926724A1 (en) | 2014-03-31 | 2015-10-07 | Hocoma AG | Method and system for an assessment of a movement of a limb-related point in a predetermined 3D space |
CN105301771B (zh) * | 2014-06-06 | 2020-06-09 | 精工爱普生株式会社 | 头部佩戴型显示装置、检测装置、控制方法以及计算机程序 |
CN105208422A (zh) * | 2014-06-26 | 2015-12-30 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
KR102322034B1 (ko) * | 2014-09-26 | 2021-11-04 | 삼성전자주식회사 | 전환 거울을 구비한 장치의 이미지 디스플레이 방법 및 그 장치 |
US20160093081A1 (en) * | 2014-09-26 | 2016-03-31 | Samsung Electronics Co., Ltd. | Image display method performed by device including switchable mirror and the device |
WO2016056449A1 (ja) * | 2014-10-10 | 2016-04-14 | 富士通株式会社 | Skill determination program, skill determination method, skill determination device, and server |
JP6452440B2 (ja) * | 2014-12-26 | 2019-01-16 | 任天堂株式会社 | Image display system, image display device, image display method, and program |
CN104581079A (zh) * | 2015-01-21 | 2015-04-29 | 山东大学 | Device and method for assisting a user in using fitness equipment |
CN105117007B (zh) * | 2015-08-20 | 2019-02-12 | 小米科技有限责任公司 | Display device control method and apparatus, and smart mat |
JP2017045160A (ja) * | 2015-08-25 | 2017-03-02 | ルネサスエレクトロニクス株式会社 | Skill instruction verification system and skill instruction verification program |
US20170177833A1 (en) * | 2015-12-22 | 2017-06-22 | Intel Corporation | Smart placement of devices for implicit triggering of feedbacks relating to users' physical activities |
CN105788375B (zh) * | 2016-05-10 | 2018-09-21 | 山东交通学院 | Multifunctional sports teaching instrument |
JP6377674B2 (ja) * | 2016-06-08 | 2018-08-22 | パラマウントベッド株式会社 | Rehabilitation support control device and computer program |
JP6230084B1 (ja) * | 2016-07-08 | 2017-11-15 | 株式会社ReTech | Posture evaluation system |
JP6436138B2 (ja) | 2016-08-23 | 2018-12-12 | トヨタ自動車株式会社 | Inverted pendulum type moving body and ankle joint torque estimation method |
IT201600105377A1 (it) * | 2016-10-20 | 2018-04-20 | Bologna Isokinetic S R L | Analysis and treatment system, particularly for medical, diagnostic, sports and rehabilitation use |
CN106344036A (zh) * | 2016-10-31 | 2017-01-25 | 广州大学 | Smart sports vest device for detecting human motion postures, and detection method thereof |
TWI674554B (zh) * | 2016-11-03 | 2019-10-11 | 財團法人工業技術研究院 | Motion evaluation method and system |
JP6524991B2 (ja) | 2016-12-09 | 2019-06-05 | トヨタ自動車株式会社 | Training system and ankle joint torque estimation method |
CN106791317A (zh) * | 2016-12-30 | 2017-05-31 | 天津航正科技有限公司 | Motion graph retrieval device for human body movement |
DE102017102144A1 (de) * | 2017-02-03 | 2018-08-09 | Stecnius UG (haftungsbeschränkt) | Training device and method for evaluating movement sequences |
CN106861167B (zh) * | 2017-03-08 | 2019-05-28 | 深圳国微技术有限公司 | Body shape correction method and body shape correction device |
CA3048542C (en) * | 2017-04-26 | 2019-12-17 | Savvy Knowledge Corporation | System for peer-to-peer, self-directed or consensus human motion capture, motion characterization, and software-augmented motion evaluation |
JP6813086B2 (ja) * | 2017-05-15 | 2021-01-13 | 富士通株式会社 | Performance display program, performance display method, and performance display device |
JP6871379B2 (ja) * | 2017-07-07 | 2021-05-12 | りか 高木 | Treatment and/or exercise instruction process management system, program for treatment and/or exercise instruction process management, computer device, and method |
DE102017116558B4 (de) * | 2017-07-21 | 2023-08-10 | Milon Industries Gmbh | Method for guiding movement sequences and training device for guiding movement sequences |
TWI646996B (zh) * | 2017-10-20 | 2019-01-11 | 亞東技術學院 | Rehabilitation system and method thereof |
JP6542407B1 (ja) | 2018-02-16 | 2019-07-10 | 株式会社東芝 | Reading system, reading method, program, and storage medium |
US10507360B2 (en) * | 2018-02-21 | 2019-12-17 | William Schroeder | Posture correction and weight balance apparatus |
US11990219B1 (en) * | 2018-05-01 | 2024-05-21 | Augment Therapy, LLC | Augmented therapy |
CN109274883B (zh) * | 2018-07-24 | 2022-02-01 | 广州虎牙信息科技有限公司 | Posture correction method, device, terminal, and storage medium |
US11557215B2 (en) * | 2018-08-07 | 2023-01-17 | Physera, Inc. | Classification of musculoskeletal form using machine learning model |
WO2020033548A2 (en) * | 2018-08-07 | 2020-02-13 | Interactive Strength, Inc. | Interactive exercise machine data architecture |
WO2020049692A2 (ja) * | 2018-09-06 | 2020-03-12 | 株式会社ソニー・インタラクティブエンタテインメント | Estimation device, learning device, estimation method, learning method, and program |
CN109254664B (zh) * | 2018-09-20 | 2022-04-29 | 鎏玥(上海)科技有限公司 | System for tracking a person's mirror image in real time |
CN108939516A (zh) * | 2018-09-20 | 2018-12-07 | 鎏玥(上海)科技有限公司 | Unmanned intelligent exercise apparatus |
CN110941977A (zh) * | 2018-09-25 | 2020-03-31 | Oppo广东移动通信有限公司 | Image processing method and device, storage medium, and electronic device |
EP3867896B1 (de) * | 2018-10-17 | 2023-05-31 | Sphery AG | Training module |
CN109701229A (zh) * | 2019-01-28 | 2019-05-03 | 重庆勤鸟圈科技有限公司 | System for evaluating the completion of a user's fitness exercise |
JP2020129018A (ja) * | 2019-02-07 | 2020-08-27 | 株式会社日立製作所 | Motion evaluation system and method |
CN109887201B (zh) * | 2019-03-05 | 2021-10-08 | 江苏中创供应链服务有限公司 | Face payment system for automatic checkout in unmanned supermarkets |
WO2020232139A1 (en) * | 2019-05-13 | 2020-11-19 | Hole-In-One Media, Inc. | Autonomous activity monitoring system and method |
FR3097395B1 (fr) * | 2019-06-15 | 2022-01-21 | Mathilde Amoros | Multimedia mirror-wall system for a physical exercise room |
CN110314336A (zh) * | 2019-08-05 | 2019-10-11 | 南京信息工程大学 | Smart yoga spine-extension monitor and method of use |
KR102323328B1 (ko) * | 2019-09-17 | 2021-11-09 | 주식회사 날마다자라는아이 | System for measuring a child's growth status using a smart scale |
CN113051973A (zh) * | 2019-12-27 | 2021-06-29 | 青岛海尔多媒体有限公司 | Method and device for posture correction, and electronic device |
CN113129786A (zh) * | 2019-12-31 | 2021-07-16 | 沸腾时刻智能科技(深圳)有限公司 | Electronic device and system |
CN113076780B (zh) * | 2020-01-03 | 2022-05-31 | 乔山健身器材(上海)有限公司 | Smart fitness mirror |
US11298578B2 (en) | 2020-01-31 | 2022-04-12 | Interactive Strength, Inc. | Positionable arm with quick release for an interactive exercise machine |
US20230154023A1 (en) * | 2020-02-18 | 2023-05-18 | University Of Miyazaki | Weight estimation device and program |
KR20210128539A (ko) * | 2020-04-16 | 2021-10-27 | 현대자동차주식회사 | Exercise information management system and control method thereof |
CA3176608A1 (en) * | 2020-04-30 | 2021-11-04 | Curiouser Products Inc. | Reflective video display apparatus for interactive training and demonstration and methods of using same |
CN114078190B (zh) * | 2020-08-19 | 2024-05-31 | 乔山健身器材(上海)有限公司 | Guidance device for fitness exercise |
US11682237B1 (en) * | 2020-09-04 | 2023-06-20 | Amazon Technologies, Inc. | System for learning and tracking performance of activities |
US11167172B1 (en) | 2020-09-04 | 2021-11-09 | Curiouser Products Inc. | Video rebroadcasting with multiplexed communications and display via smart mirrors |
US20220277663A1 (en) * | 2021-02-26 | 2022-09-01 | Justin A Tehrani | Guided Learning Systems, Devices, and Methods |
CN115188064A (zh) * | 2021-04-07 | 2022-10-14 | 华为技术有限公司 | Method for determining exercise guidance information, electronic device, and exercise guidance system |
US12179064B2 (en) * | 2021-04-11 | 2024-12-31 | Vikas Khurana | System, apparatus and method for training a subject |
US20220335849A1 (en) * | 2021-04-20 | 2022-10-20 | ProMentor, Inc. | Digital video sharing, analysis, and aggregation |
CN113378692B (zh) * | 2021-06-08 | 2023-09-15 | 杭州萤石软件有限公司 | Method and detection system for reducing false detection of falling behavior |
CN113935921B (zh) * | 2021-10-19 | 2024-06-04 | 成都拟合未来科技有限公司 | Mirror-type fitness information interaction method and system |
CN114939216B (zh) * | 2022-05-30 | 2023-11-10 | 深圳英鸿骏智能科技有限公司 | Device and method for assisting rehabilitation exercise |
FR3136577B1 (fr) | 2022-06-13 | 2024-12-13 | LSTGmove | Method for a person to reproduce a reference movement, and device for implementing it |
CN115624724B (zh) * | 2022-11-01 | 2025-03-18 | 上海工程技术大学 | Virtual-reality-based upper-limb rehabilitation training system |
2011
- 2011-09-22 WO PCT/JP2011/071667 patent/WO2012039467A1/ja active Application Filing
- 2011-09-22 CN CN201180045458.3A patent/CN103118647B/zh not_active Expired - Fee Related
- 2011-09-22 US US13/822,828 patent/US20130171601A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0956697A (ja) * | 1995-08-21 | 1997-03-04 | Matsushita Electric Ind Co Ltd | Rehabilitation support device |
JPH09120464A (ja) * | 1995-08-21 | 1997-05-06 | Matsushita Electric Ind Co Ltd | Rehabilitation support device |
JP2010517731A (ja) * | 2007-02-14 | 2010-05-27 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Feedback device for instructing and supervising physical exercise |
JP2009201799A (ja) * | 2008-02-28 | 2009-09-10 | Xing Inc | Exercise support device, exercise support method, and computer program |
JP2009277195A (ja) * | 2008-04-18 | 2009-11-26 | Panasonic Electric Works Co Ltd | Information display system |
Cited By (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11710549B2 (en) | 2010-11-05 | 2023-07-25 | Nike, Inc. | User interface for remote joint workout session |
US11094410B2 (en) | 2010-11-05 | 2021-08-17 | Nike, Inc. | Method and system for automated personal training |
US11915814B2 (en) | 2010-11-05 | 2024-02-27 | Nike, Inc. | Method and system for automated personal training |
US10583328B2 (en) | 2010-11-05 | 2020-03-10 | Nike, Inc. | Method and system for automated personal training |
US10420982B2 (en) | 2010-12-13 | 2019-09-24 | Nike, Inc. | Fitness training system with energy expenditure calculation that uses a form factor |
US10825561B2 (en) | 2011-11-07 | 2020-11-03 | Nike, Inc. | User interface for remote joint workout session |
US10188930B2 (en) | 2012-06-04 | 2019-01-29 | Nike, Inc. | Combinatory score having a fitness sub-score and an athleticism sub-score |
JP2017221686A (ja) * | 2012-06-04 | 2017-12-21 | ナイキ イノベイト シーブイ | Combined score including a fitness sub-score and an athleticism sub-score |
JP2014057800A (ja) * | 2012-09-19 | 2014-04-03 | Nagasakiken Koritsu Daigaku Hojin | Motion evaluation support device and motion evaluation support method |
JP2014068714A (ja) * | 2012-09-28 | 2014-04-21 | Kitasato Institute | Joint angle measurement system |
JP2014128409A (ja) * | 2012-12-28 | 2014-07-10 | Brother Ind Ltd | Information processing device, information processing method, and program |
JP2014138661A (ja) * | 2013-01-21 | 2014-07-31 | Toshiba Corp | Medical image display device and program |
US10170155B2 (en) | 2013-01-21 | 2019-01-01 | Toshiba Medical Systems Corporation | Motion information display apparatus and method |
WO2014112645A1 (ja) * | 2013-01-21 | 2014-07-24 | 株式会社東芝 | Motion information display device and program |
JP2014142695A (ja) * | 2013-01-22 | 2014-08-07 | Ricoh Co Ltd | Information processing device, system, image projection device, information processing method, and program |
JP2014188146A (ja) * | 2013-03-27 | 2014-10-06 | Nippon Telegraph & Telephone East Corp | Exercise posture evaluation device, exercise posture evaluation method, and computer program |
CN105228708A (zh) * | 2013-04-02 | 2016-01-06 | 日本电气方案创新株式会社 | Body motion scoring device, dance scoring device, karaoke device, and game device |
CN105228708B (zh) * | 2013-04-02 | 2017-08-29 | 日本电气方案创新株式会社 | Body motion scoring device, dance scoring device, karaoke device, and game device |
WO2014162788A1 (ja) * | 2013-04-02 | 2014-10-09 | Necソリューションイノベータ株式会社 | Facial expression scoring device, dance scoring device, karaoke device, and game device |
JPWO2014162788A1 (ja) * | 2013-04-02 | 2017-02-16 | Necソリューションイノベータ株式会社 | Facial expression scoring device, dance scoring device, karaoke device, and game device |
WO2014162787A1 (ja) * | 2013-04-02 | 2014-10-09 | Necソリューションイノベータ株式会社 | Body motion scoring device, dance scoring device, karaoke device, and game device |
CN105050673A (zh) * | 2013-04-02 | 2015-11-11 | 日本电气方案创新株式会社 | Facial expression scoring device, dance scoring device, karaoke device, and game device |
CN105050673B (zh) * | 2013-04-02 | 2019-01-04 | 日本电气方案创新株式会社 | Facial expression scoring device, dance scoring device, karaoke device, and game device |
JP6082101B2 (ja) * | 2013-04-02 | 2017-02-15 | Necソリューションイノベータ株式会社 | Body motion scoring device, dance scoring device, karaoke device, and game device |
US10803762B2 (en) | 2013-04-02 | 2020-10-13 | Nec Solution Innovators, Ltd | Body-motion assessment device, dance assessment device, karaoke device, and game device |
JPWO2014162787A1 (ja) * | 2013-04-02 | 2017-02-16 | Necソリューションイノベータ株式会社 | Body motion scoring device, dance scoring device, karaoke device, and game device |
JP2014240800A (ja) * | 2013-06-12 | 2014-12-25 | 株式会社ブリヂストン | Inspection assistance device |
US20150003687A1 (en) * | 2013-07-01 | 2015-01-01 | Kabushiki Kaisha Toshiba | Motion information processing apparatus |
US9761011B2 (en) * | 2013-07-01 | 2017-09-12 | Toshiba Medical Systems Corporation | Motion information processing apparatus obtaining motion information of a subject performing a motion |
JP2015097639A (ja) * | 2013-11-19 | 2015-05-28 | 日本コントロールシステム株式会社 | Karaoke device, dance scoring method, and program |
JP2015198834A (ja) * | 2014-04-09 | 2015-11-12 | 黄富表Huang Fubiao | Determination of the recovery stage of cerebrovascular disorder patients using Microsoft Kinect (registered trademark) |
JP2018171523A (ja) * | 2014-07-03 | 2018-11-08 | 帝人ファーマ株式会社 | Rehabilitation support device and control program for the rehabilitation support device |
US10417931B2 (en) | 2014-07-03 | 2019-09-17 | Teijin Pharma Limited | Rehabilitation assistance device and program for controlling rehabilitation assistance device |
JP2017074350A (ja) * | 2014-12-25 | 2017-04-20 | ダンロップスポーツ株式会社 | Swing analysis device, method, and program |
KR101728775B1 (ko) * | 2015-01-28 | 2017-04-21 | 경북대학교 산학협력단 | Half-mirror-based display device and method for matching focus and viewpoint, and recording medium for performing the method |
JP2016174637A (ja) * | 2015-03-18 | 2016-10-06 | 株式会社タイトー | Dance device |
KR102291039B1 (ko) * | 2015-03-25 | 2021-08-19 | 한국전자통신연구원 | Method and device for providing user-customized exercise services |
KR20160114996A (ko) * | 2015-03-25 | 2016-10-06 | 한국전자통신연구원 | Method and device for providing user-customized exercise services |
US10360444B2 (en) | 2015-06-16 | 2019-07-23 | Kabushiki Kaisha Toshiba | Image processing apparatus, method and storage medium |
JP2017004464A (ja) * | 2015-06-16 | 2017-01-05 | 株式会社東芝 | Image processing device, image processing system, image processing method, and program |
WO2017017737A1 (ja) * | 2015-07-24 | 2017-02-02 | 富士通株式会社 | Display method, monitoring result output method, information processing device, and display program |
US10366617B2 (en) | 2015-09-08 | 2019-07-30 | Toyota Jidosha Kabushiki Kaisha | Walking training apparatus and walking training method |
JP2017060572A (ja) * | 2015-09-24 | 2017-03-30 | パナソニックIpマネジメント株式会社 | Functional training device |
US10460451B2 (en) | 2015-10-21 | 2019-10-29 | University Of Tsukuba | Evaluation information provision system and evaluation information provision method |
WO2017069115A1 (ja) * | 2015-10-21 | 2017-04-27 | 国立大学法人 筑波大学 | Evaluation information provision system and evaluation information provision method |
JP2017077403A (ja) * | 2015-10-21 | 2017-04-27 | 国立大学法人 筑波大学 | Evaluation information provision system and evaluation information provision method |
WO2018168756A1 (ja) * | 2017-03-15 | 2018-09-20 | 本田技研工業株式会社 | Walking support system, walking support method, and program |
JP2018153234A (ja) * | 2017-03-15 | 2018-10-04 | 本田技研工業株式会社 | Walking support system, walking support method, and program |
KR20180107015A (ko) * | 2017-03-21 | 2018-10-01 | 성균관대학교산학협력단 | Method and device for determining dangerous situations of a walking aid |
KR102013353B1 (ko) * | 2017-03-21 | 2019-08-22 | 성균관대학교산학협력단 | Method and device for determining dangerous situations of a walking aid |
JP2019008393A (ja) * | 2017-06-21 | 2019-01-17 | FunLife株式会社 | Apparatus with a sensor, half mirror, and display device |
JP7011416B2 (ja) | 2017-07-25 | 2022-01-26 | 株式会社クオンタム | Detection device, detection system, processing device, and detection program |
JP2019024550A (ja) * | 2017-07-25 | 2019-02-21 | 株式会社クオンタム | Detection device, detection system, processing device, detection method, and detection program |
JP2019118783A (ja) * | 2018-01-10 | 2019-07-22 | ユインケア コーポレーション | Remote rehabilitation analysis device and method |
JP2019150545A (ja) * | 2018-03-06 | 2019-09-12 | 株式会社 Mtg | Installation structure and method for an exercise booth |
WO2019172298A1 (ja) * | 2018-03-06 | 2019-09-12 | 株式会社Mtg | Installation structure and method for an exercise booth |
JP2022023867A (ja) * | 2018-05-29 | 2022-02-08 | キュリアサー プロダクツ インコーポレイテッド | Reflective video display apparatus for interactive training and demonstration, and methods of using same |
KR101959079B1 (ko) * | 2018-10-08 | 2019-03-18 | 주식회사 마이베네핏 | Body measurement and evaluation method |
JP2020098291A (ja) * | 2018-12-19 | 2020-06-25 | カシオ計算機株式会社 | Display device, display method, and program |
KR102127355B1 (ko) * | 2018-12-28 | 2020-06-29 | 전북대학교산학협력단 | Multi-type sling exercise device and system |
JP2019205820A (ja) * | 2019-03-15 | 2019-12-05 | 株式会社コナミスポーツライフ | Exercise instruction device, program, training system, and exercise equipment |
WO2021149918A1 (ko) * | 2020-01-23 | 2021-07-29 | 고려대학교 산학협력단 | Bone age estimation method and apparatus |
US12102464B2 (en) | 2020-01-23 | 2024-10-01 | Korea University Research And Business Foundation | Bone age estimation method and apparatus |
JP6811349B1 (ja) * | 2020-03-31 | 2021-01-13 | 株式会社三菱ケミカルホールディングス | Information processing device, method, and program |
JP2021159313A (ja) * | 2020-03-31 | 2021-10-11 | 株式会社三菱ケミカルホールディングス | Information processing device, method, and program |
JP7507484B2 (ja) | 2020-03-31 | 2024-06-28 | 株式会社Shosabi | Information processing device, method, and program |
WO2022054366A1 (ja) * | 2020-09-09 | 2022-03-17 | 高木 りか | Posture evaluation program, posture evaluation device, posture evaluation method, and posture evaluation system |
JP7379302B2 (ja) | 2020-09-09 | 2023-11-14 | 高木 りか | Posture evaluation program, posture evaluation device, posture evaluation method, and posture evaluation system |
JP2022045832A (ja) * | 2020-09-09 | 2022-03-22 | 高木 りか | Posture evaluation program, posture evaluation device, posture evaluation method, and posture evaluation system |
WO2022075116A1 (ja) * | 2020-10-06 | 2022-04-14 | 国立研究開発法人産業技術総合研究所 | Information display device and information display method |
US11842027B2 (en) | 2021-03-17 | 2023-12-12 | Samsung Electronics Co., Ltd. | Electronic device and controlling method of electronic device |
JP2023008347A (ja) * | 2021-07-05 | 2023-01-19 | オプティメース株式会社 | Wearable device and body motion support system |
JP7611421B2 (ja) | 2021-11-10 | 2025-01-09 | 株式会社Nttドコモ | Video creation device, video creation method, and program |
WO2023084965A1 (ja) * | 2021-11-10 | 2023-05-19 | 株式会社Nttドコモ | Video creation device, video creation method, and program |
CN114267220B (zh) * | 2021-12-27 | 2024-01-26 | 林华 | Surgical teaching simulation method and system |
CN114267220A (zh) * | 2021-12-27 | 2022-04-01 | 林华 | Surgical teaching simulation method and system |
JP2023102856A (ja) * | 2022-01-13 | 2023-07-26 | 三菱ケミカルグループ株式会社 | Program, method, and electronic device |
JP7150387B1 (ja) | 2022-01-13 | 2022-10-11 | 三菱ケミカルグループ株式会社 | Program, method, and electronic device |
JP2023107347A (ja) * | 2022-01-24 | 2023-08-03 | 三菱ケミカルグループ株式会社 | Information processing device, method, and program |
JP7201850B1 (ja) | 2022-01-24 | 2023-01-10 | 三菱ケミカルグループ株式会社 | Information processing device, method, and program |
WO2023139944A1 (ja) * | 2022-01-24 | 2023-07-27 | 三菱ケミカルグループ株式会社 | Information processing device, method, and program |
WO2023188280A1 (ja) * | 2022-03-31 | 2023-10-05 | 日本電気株式会社 | Motion information generation device, motion information generation system, motion information generation method, and recording medium |
KR102619887B1 (ko) * | 2023-03-17 | 2024-01-02 | 한연오 | System and method for supporting physical activity challenge services |
JP7620372B1 (ja) | 2023-04-18 | 2025-01-23 | 株式会社Orgo | Exercise evaluation device, exercise evaluation method, and exercise evaluation program |
Also Published As
Publication number | Publication date |
---|---|
CN103118647A (zh) | 2013-05-22 |
CN103118647B (zh) | 2016-04-06 |
US20130171601A1 (en) | 2013-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012039467A1 (ja) | Exercise support system | |
JP5624625B2 (ja) | Exercise support system | |
US11033453B1 (en) | Neurocognitive training system for improving visual motor responses | |
US11337606B1 (en) | System for testing and/or training the vision of a user | |
US10342473B1 (en) | System and method for measuring eye movement and/or eye position and postural sway of a subject | |
JP6359343B2 (ja) | Motion information processing device and method | |
US9526443B1 (en) | Force and/or motion measurement system and a method of testing a subject | |
JP5400077B2 (ja) | Exercise support system | |
US12201363B1 (en) | System for testing and/or training the vision of a user | |
US20130171596A1 (en) | Augmented reality neurological evaluation method | |
US20150004581A1 (en) | Interactive physical therapy | |
US20150140534A1 (en) | Apparatus and method for gait training | |
US9418470B2 (en) | Method and system for selecting the viewing configuration of a rendered figure | |
CN109731292A (zh) | Balance ability testing and training system and method based on virtual reality technology | |
JP2005198818A (ja) | Learning support system and learning support method for body movements | |
CN109045651B (zh) | Golf swing correction system | |
JP6310255B2 (ja) | Method and device for presenting options | |
KR20190130761A (ko) | User-aware gait motion measurement system and gait motion measurement method using the same | |
Diaz-Monterrosas et al. | A brief review on the validity and reliability of Microsoft Kinect sensors for functional assessment applications | |
US20040059264A1 (en) | Footprint analyzer | |
JP2025018887A (ja) | Furniture-type device, accessory device, and system for estimating the leg posture of a seated user | |
KR101398193B1 (ko) | Calibration device and method | |
JP5552010B2 (ja) | Range-of-motion training system | |
JP2018038679A (ja) | Standing posture evaluation device | |
KR101553015B1 (ko) | Golf swing practice and correction system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180045458.3 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11826912 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 13822828 Country of ref document: US |
ENP | Entry into the national phase |
Ref document number: 2012535074 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 11826912 Country of ref document: EP Kind code of ref document: A1 |