
WO2014186537A1 - Game-based sensorimotor rehabilitator - Google Patents

Game-based sensorimotor rehabilitator

Info

Publication number
WO2014186537A1
WO2014186537A1 (application PCT/US2014/038124)
Authority
WO
WIPO (PCT)
Prior art keywords
hand
game controller
force
sensor
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2014/038124
Other languages
French (fr)
Inventor
Preeti Raghavan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New York University NYU
Original Assignee
New York University NYU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New York University NYU filed Critical New York University NYU
Priority to EP14798022.1A priority Critical patent/EP2996551A4/en
Publication of WO2014186537A1 publication Critical patent/WO2014186537A1/en
Priority to US14/942,971 priority patent/US10299738B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/742Details of notification to user or communication with user or patient; User input means using visual displays
    • A61B5/744Displaying an avatar, e.g. an animated cartoon character
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1124Determining motor skills
    • A61B5/1125Grasping motions of hands
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/22Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/224Measuring muscular strength
    • A61B5/225Measuring muscular strength of the fingers, e.g. by monitoring hand-grip force
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/486Biofeedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6804Garments; Clothes
    • A61B5/6806Gloves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6823Trunk, e.g., chest, back, abdomen, hip
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/7455Details of notification to user or communication with user or patient; User input means characterised by tactile indication, e.g. vibration or electrical stimulation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09Rehabilitation or training
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0257Proximity sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves

Definitions

  • the invention includes systems and methods to help restore dexterity and functional hand use in patients with neurologic impairment from various conditions, for example, stroke, cerebral palsy, spinal cord injury, multiple sclerosis, etc.
  • the objective of this game-based physical therapy schema is to engage individuals using a biofeedback strategy to facilitate a patient's brain to develop alternate neural pathways to overcome the damage produced by the stroke.
  • the game will provide an in-home option to encourage the patient to perform therapy more frequently to hasten their improvement and minimize the stress associated with travel to therapy centers.
  • the concept leverages current interest in computer games on small inexpensive wireless communication devices with sophisticated biomechanical algorithms to generate clinical metrics of performance and biofeedback. Data from the biomechanical analyses will be accessible to one or more providers and be interfaced with the electronic medical record for their input and direction.
  • the schema can also be adapted to generate training platforms for patients with other types of muscular deficits and in intact individuals.
  • Figure 1 Schematic of the components of the device.
  • Figure 2 Parts used to create device components.
  • Figure 3 Sensorized game controller shaped like real objects that need to be manipulated; e.g. a coffee cup, a stick, cooking and eating implements, grooming tools, writing tools, adapted typing and musical keyboards, objects of various shapes, sizes, weights and textures.
  • Figure 4 is a graph illustrating peak load force rate linearly scaled to weight and activity in lifting muscles.
  • Figures 5A-5B illustrate the determination of scaling error for a given trial across weights (Figure 5A) and trial-to-trial variability (Figure 5B) to measure successful adaptation and savings.
  • Figure 6A is an image of a virtual human grasping a cylinder.
  • Figure 6B is a schematic of a finger grasping a cylinder.
  • Figure 6C is a schematic mapping the joints of a hand.
  • Figure 7 illustrates graphs of preliminary results with time profiles of measured forces from sensors (left) and torques at each finger joint (right).
  • Figure 8 illustrates a sensorized cuff and a sensorized sleeve.
  • Figure 9 illustrates a sensorized vest.
  • Figure 10 illustrates an embodiment of a computer system of the present invention.
  • Figure 11 illustrates anticipatory muscle activation as it reduces abnormal directional biases and increases grasp efficiency.
  • Figure 12 illustrates a graph showing error reduction rates - reflective learning - with adaptation and repetition, with + indicating presence and - indicating absence.
  • Figures 13A-C illustrate congruent hand practice impact on adaptation restoration and muscle control in the affected hand post-stroke.
  • Figures 14A-B illustrate symptomatic pianists showing reduced activity in the upper and lower trapezius (LT) and increased activity in the finger extensor (EDC) at baseline compared with asymptomatic pianists. With LT activation, the slope (m) between EDC and LT in symptomatic pianists shifts closer to that of the asymptomatic group compared to baseline (b).
  • Figure 14A illustrates normalized muscle activity
  • Figure 14B illustrates the extensor digitorum communis.
  • Figure 15 is a graph of the relationship between lower trapezius activity and grip force at lift during grasping with the affected hand post-stroke.
  • Figures 16A-16B illustrate graphs of alternate hand training improving the adaptation of fingertip forces to object weight ratio (Figure 16A) and grasp execution (Figure 16B).
  • Figure 17 illustrates an implementation of a system with a computing device in communication with a controller and game device.
  • Figure 18 illustrates an implementation of a dorsal glove.
  • Figures 19A-19B show linear relationships between variables measuring grasp adaptation and object weight and texture.
  • Figures 20A-20B show similar difference in peak force rates for specified weight and texture pairs in healthy individuals.
  • Figures 21A-C show correlation between the Slip Ratio and Peak Grip Force Rate for a series of lifts of different textured objects performed with bare hands, a thin film of Tegaderm over the fingertips, and a coating of foam over the fingers.
  • the purpose of the present invention is to create a portable therapeutic platform that facilitates skill training and retraining anywhere, anytime, using a novel game controller, such as a game-based sensorimotor rehabilitator, that serves as a virtual coach providing skill training to both healthy and neurologically impaired individuals using real-time tactile, kinesthetic, and visual feedback.
  • One implementation is comprised of three components: 1) the game controller, such as a cup, ball, or other tool or implement, 2) a microcontroller, and 3) a computing device, such as a handheld, laptop, or desktop computer running game-analysis software.
  • sensor information 101 is provided from the game controller to a microcontroller 102, which may be on the game controller or separate.
  • the microcontroller 102 determines the game controller's state, e.g. force being applied, orientation, acceleration, velocity, etc.
  • the microcontroller 102 determines if the game controller's state is within a predetermined set of ranges for a particular application, for example tilt of the game controller in a simulation involving a cup.
  • the microcontroller 102 provides information to a computing device 103 and to a feedback device 104.
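The control flow described in these bullets (read sensors, determine the controller's state, compare it against a predetermined range, drive the feedback device) can be sketched as follows. This is a minimal illustration, not the disclosed firmware: the state names, thresholds, and return values are assumptions.

```python
# Hypothetical sketch of the microcontroller's range-check logic.
# Quantities and limits (tilt of a cup-shaped controller, grip force)
# are illustrative assumptions, not values from the patent.

def within_range(state: dict, limits: dict) -> bool:
    """Return True if every monitored quantity is inside its allowed range."""
    return all(limits[k][0] <= state[k] <= limits[k][1] for k in limits)

def update_feedback(state: dict, limits: dict) -> str:
    """Decide whether the feedback device should be active for this state."""
    return "feedback_off" if within_range(state, limits) else "feedback_on"

# e.g. tilt must stay within +/- 15 degrees, grip force under 20 N (assumed)
limits = {"tilt_deg": (-15.0, 15.0), "grip_n": (0.0, 20.0)}
print(update_feedback({"tilt_deg": 5.0, "grip_n": 8.0}, limits))   # feedback_off
print(update_feedback({"tilt_deg": 30.0, "grip_n": 8.0}, limits))  # feedback_on
```

In a real device this check would run inside the microcontroller's main loop, with the same state dictionary also forwarded to the computing device 103.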
  • an interactive gaming platform is implemented on a handheld, laptop, or desktop computer system with web access.
  • the system may include software-based portions and a hardware-in-the-loop interface through a real object. It is anticipated that such will have wide applicability in homes, gyms, and rehabilitation centers for patients with neurologic impairment from various conditions, e.g., stroke, cerebral palsy, spinal cord injury, multiple sclerosis, etc.
  • a successful rehabilitation outcome will restore dexterity and flexible functional hand use with wide access to facilitate tele-diagnostics and tele-treatment through easy modification of gaming parameters.
  • Certain implementations provide systems and methods to restore adaptation, facilitate grasp efficiency and normal directional biases during repetition and enhance the rate of learning to improve hand function and quality of life post stroke.
  • somatosensory and visual information from each side of the body is processed bilaterally, and interlimb coordination is mediated by motor representations in the parietal and premotor areas shared by both limbs.
  • the undamaged hemisphere shows increased activation, but eventually normal sensorimotor lateralization is restored in the stroke-affected hemisphere.
  • redundant homologous pathways in the intact hemisphere facilitate reorganization within the affected hemisphere by mechanisms such as unmasking projections from the intact motor cortex to the cervical spinal cord, and axonal sprouting and formation of novel subcortical projections.
  • the affected hand 1101 of the individual in Figure 11 exhibits improved movement efficiency when the methods are applied, based on the long-latency response in the lower trapezius observed during the unaffected hand's 1111 movement.
  • Stage 1: Early learning is thought to depend on attentional mechanisms. Attention to specific task features or sources of sensory information (e.g. kinesthetic, tactile, and visual) at this stage may assist with learning.
  • Stage 2: The next, associative stage is characterized by the formation of visuomotor and sensorimotor associations through trial-and-error practice. At this stage, transfer of learning from one limb to the other occurs readily. Associations formed by using the unaffected hand can be used to improve the associations formed with the affected hand. In later learning (Stage 3), the movement is fine-tuned and becomes automatic. This stage of learning requires repeated practice with the affected hand, which will lead to greater movement efficiency.
  • Certain implementations provide a solution for re-learning of skilled hand function where the individual can learn at their own pace with guidance from the game-based sensorimotor rehabilitator (SMR).
  • the systems and methods enable the individuals to learn to interact with functional objects using just the right amount of force, tilt, finger torques, and muscle activity to regain lost skill.
  • real objects are customized with force, orientation and acceleration sensors and the object is virtualized on screen.
  • visual feedback is provided regarding the appropriateness of fingertip forces, and orientation in a systematic manner to facilitate learning of the correct associations.
  • the force and orientation information is also fed to computational biomechanical models, and software algorithms to inform the thresholds for visual feedback.
  • wearable position and muscle sensors embedded in a dorsal glove, cuff, sleeve or vest at key areas also send information regarding limb position to the computational models and algorithms.
  • the output of the algorithms is sent to actuators located on the glove/cuff/sleeve/vest to provide tactile feedback to key areas, in the same manner that a teacher or coach would apply a gentle touch to cue movement in a certain manner. Tactile feedback will turn on or off when the individual moves within or outside the set parameters of the movement.
  • Real time, trial-to-trial information from interaction with the object can be stored and communicated to the provider's electronic medical record for diagnosis, documentation of progress, and fine-tuning of stimulus parameters and gaming and feedback levels provided through a service provider.
  • Various implementations utilize methods for how and when practice should be facilitated with one hand, or both hands separately, alternately, or simultaneously.
  • FIG. 1 shows one implementation of a device for facilitating rehabilitation and training described above.
  • Figure 2 shows one implementation of a rehabilitation system 200.
  • a microcontroller 201 is configured to receive sensor information from force sensors 211, which may be part of a game controller physically manipulated by the user.
  • An inertial measurement unit (IMU) is in communication with the microcontroller to provide inertial information, such as gyroscopic, accelerometer, and magnetometer information.
  • the microcontroller 201 is further in communication with a feedback mechanism 213.
  • the microcontroller 201 may also utilize a wireless communication unit 203, such as to communicate with a second microprocessor 205.
  • the microcontroller 201 is ultimately in communication with a computing device 207, such as a PC or laptop.
  • the game controller may have a form factor similar to a functional object, such as a coffee cup or ball.
  • the controller may consist of an object to be manipulated, such as a cup, and a wearable device for the user, such as a vest.
  • the controllers are objects with various shapes and sizes equipped with force sensors, position/orientation sensors, inertial measurement units, and vibrational modules (Fig.3).
  • Figure 3 shows a cup-like game controller 301 having a cylindrical body.
  • Force sensors 303 such as force sensitive resistors, are positioned to correspond with a user's hand.
  • the force sensors may be positioned at multiple locations to correlate with fingertip and palm placement on the game controller 301.
  • a separate feedback mechanism 320 is provided, such as a strap or bracelet, to provide feedback regarding the position of the game controller 301.
  • Figure 8 illustrates another implementation where the system 800 includes a sleeve 810 as part of the game controller.
  • the sleeve 810 includes position sensors 805 as well as a wireless transmitter 801.
  • the sensors 805 may be positioned to indicate the position of the upper arm and lower arm.
  • Vibro-tactile actuators 803 are included on the sleeve 810, for example on the flexor and extensor portions of the arm.
  • a cuff 830 may also be provided for further vibro-tactile feedback via vibro-tactile actuators 803.
  • FIG. 9 illustrates another implementation where the system 900 includes a vest 910 as part of the game controller.
  • the vest 910 includes position sensors 905 as well as a wireless transmitter 901.
  • the sensors 905 may be positioned to indicate the position of the trunk.
  • Vibro-tactile actuators 903 are included on the vest 910, for example corresponding to the lower trapezius and pectoralis muscle positions.
  • the force sensors are, in one implementation, oriented orthogonally to measure both grip and load forces, and the rate of change of these forces will be computed.
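The rate-of-change computation mentioned above can be illustrated with a simple finite-difference sketch. The sample values, sampling rate, and units below are assumptions for demonstration only.

```python
# Illustrative computation of peak force rate from a sampled force trace,
# standing in for the grip/load force-rate computation described above.

def peak_force_rate(samples, dt):
    """Peak of the first difference of a force trace sampled every dt seconds."""
    rates = [(b - a) / dt for a, b in zip(samples, samples[1:])]
    return max(rates, default=0.0)

grip = [0.0, 1.0, 3.0, 6.0, 8.0, 9.0]   # N, invented grip-force trace
dt = 0.01                                # 100 Hz sampling, assumed

print(round(peak_force_rate(grip, dt)))  # 300 (N/s)
```

The same function applied to the load-force trace would yield the peak load force rate (pLFR) used by the adaptation metrics later in the text.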
  • Certain patient-specific game controllers include software to communicate, such as wirelessly, with a computer. The information from the sensorized game controller is also integrated with positional and muscle recruitment information from sensors sewn into a cuff/sleeve/vest worn by an individual in certain implementations.
  • the controllers include one or more of sensors and feedback systems.
  • the controllers may be adapted for specific treatment regimes. For example, a controller may provide information to software on an associated computer regarding the grip force, movement speed, and movement direction of an object the user is interacting with. The software may then provide feedback to the user through the controller, such as vibrotactile actuators.
  • One implementation of the game controller comprises a cylinder similar in size to a coffee cup, instrumented with two pressure transducers positioned at the thumb and middle finger locations and a 3D spatial positioning transmitter. Foam or other tactile-deadening material may be used on the game controller to block tactile information to the user, such as to simulate an injury.
  • the micro-computer interface includes circuitry to acquire force data from force transducers and 3D position transducers.
  • the controller communicates wirelessly with the game based software on the laptop that interacts and controls game avatars representing a real-world object.
  • the microcomputer interface and sensors are incorporated within the controller.
  • the software development uses a 3D game engine's flexible input capacity to drive an interactive visualization of the patient's grip force with game-like reward structures and feedback systems.
  • Figure 17 shows a simplified device layout of Figure 2, where a system 1701 includes a game controller 1710 in communication with a controller 1701 serving as an interface between the game controller 1710 and a computing device 1707 (shown as a laptop).
  • the microcontroller's embedded data acquisition code continuously monitors the pressure transducer outputs and streams data via the RS-232C protocol to the laptop's game-analysis software, which, in turn, controls the avatar and provides an interactive, engaging, and fun experience for the patient.
  • the laptop software communicates to the patient using visual and audio cues provided by the game concept.
  • the microcontroller acquires analog voltages associated with pressure transducer response and then transmits raw temporal force data, i.e. force changes over time for each transducer during the training session.
  • Circuitry within the microcontroller a) acquires the analog voltage signals from the pressure transducers; b) performs signal filtering and analog to digital conversion; c) adds a time stamp to each data pair using the micro-controller's system clock and then transmits the data stream via RS-232C serial format to the laptop.
  • This serial data stream is captured by subroutines within the laptop's game program that interprets the force and translates it into "Popeye squeezing the can".
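The acquire-filter-timestamp-transmit pipeline described in these bullets might be sketched as below. The smoothing filter (standing in for the analog filtering) and the comma-separated frame format are assumptions; the patent does not specify either.

```python
# Hypothetical sketch of the acquisition pipeline: filter a raw reading,
# then pack a timestamped data pair into a serial frame.

def lowpass(prev: float, raw: float, alpha: float = 0.2) -> float:
    """Exponential smoothing standing in for the signal filtering step."""
    return prev + alpha * (raw - prev)

def frame(t_ms: int, f_thumb: float, f_middle: float) -> bytes:
    """Timestamped ASCII line, as one might send it over an RS-232C link."""
    return f"{t_ms},{f_thumb:.2f},{f_middle:.2f}\n".encode()

print(frame(12, 4.5, 3.25))  # b'12,4.50,3.25\n'
```

On the laptop side, the game-analysis subroutines would parse each line back into a timestamp and two force values before updating the avatar.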
  • interaction with the interactive "cup controller" 1710 provides vibro-tactile feedback in addition to visual and auditory feedback.
  • This version consists of an instrumented "cup controller" 1710 and a wrist band (not shown in Figure 17) with vibro-tactile actuators.
  • the cup controller 1710 is virtualized on a screen of the computing device 1707 as a cup which needs to be moved in order to catch balls on the screen. As the patient picks up the cup controller 1710, the goal is to learn to use the right amount of finger force to grasp the cylinder and the right tilt.
  • one implementation relates to interactive training algorithms.
  • the training algorithms are based on the principles of 'learning to learn', where learning simple sensorimotor associations within a low-dimensional task structure will lead to faster acquisition of similar associations for other tasks. The stepwise introduction of variability and complexity across the stages of learning will then lead to generalization of learning to novel tasks.
  • the training will be delivered using instructions on the computerized display to the user based on input from the sensorized objects and the dorsal glove/cuff/sleeve/vest implementations.
  • the instructions may cue practice with one or the other hand, and provide feedback and reward for correct performance.
  • a sensorized mat is utilized and the training algorithm is adapted for such sensorized mat to facilitate standardized object placement in the work space and a structured progression of training from simple sensory-motor mappings, to practice with more complex real-world objects leading to increased overall skill.
  • the training algorithms may be useful for training hand skill in other patient populations as well.
  • a video game platform is provided.
  • a game controller is virtualized on screen, and interfaced with game scenarios to provide visual feedback of the interaction with the object and augment the feedback based on the output of the analytic algorithms.
  • the game scenarios will provide interactive and enjoyable training. The game can be viewed on a variety of platforms, e.g. iPod, iPhone, iPad, laptops, and the controller itself.
  • One implementation utilizes analytic software.
  • the game controller software will continuously monitor force transducer pressures and acceleration of the object from trial-to-trial from each hand separately for diagnostic information.
  • the user will be directed to modify the object by adding additional base plates to change its weight (w), or adding stickers to the region over the force transducers to change the texture (t) of the object-grasp interface.
  • the user will also be asked to hold the object in various ways to gather information on the finger joint positions to compute estimated torques at the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joints.
  • electronic inputs from position and muscle sensors are embedded in wearable modular garments comprising dorsal gloves/cuffs/sleeves and a vest at key locations to signal the position, orientation, and activity at critical positions or muscle activity levels, to provide information about the limb strategy used to manipulate the object.
  • electronic inputs from the device will be integrated into computational hand/arm models and software algorithms that will (1) integrate information from all the activated sensors on the object and the wearable garment, (2) process the information as detailed below to compute performance parameters, (3) send outputs to actuators located on the glove/cuff/sleeve/vest to activate key areas to provide tactile feedback about how the movement is being performed, and (4) transmit the clinical performance data and analytics to physicians or providers to integrate with the user's electronic medical record.
  • a dorsal glove 1800 shown in Figure 18, is utilized.
  • the dorsal glove 1800 rests on the top of the hand. It is fixed to the palmar surface using adjustable bands 1810.
  • the adjustable bands are fixed over the creases of the proximal interphalangeal joints and distal interphalangeal joints.
  • Sensors on the dorsal glove detect the positions and angles of the proximal interphalangeal, distal interphalangeal, and metacarpophalangeal joints. Vibrotactile actuators embedded on the glove over the knuckles can cue changes in position when specific positions are detected.
  • orthogonally placed force transducers will measure the rate of change of load force (vertical force) and grip force (normal force), which will be used to compute scaling error, or the mismatch between the forces needed to grasp the object and those actually produced.
  • Ideal performance metrics will be obtained from stored normative data of healthy individuals, or "normal" performance from the unaffected limb.
  • the reduction rate of the scaling error E, across J trials will be a measure for "successful" adaptation.
  • the scaling error will first be computed during initial diagnostic trials prior to training.
  • the aim of the training session will be to reduce the mismatch in scaling error between the normative data and the real time data, and/or between the "unaffected hand” and the "affected hand” which will be the reference hand data.
  • the peak LFR or peak GFR at each training trial will be compared with the average error for the given weight of the object from the "normal" trials.
  • a mismatch of >50% will signal augmentation of visual feedback on the level of grip force and prompt the individual to practice with the unaffected hand before practicing with the affected hand, until the mismatch is reduced to <10%.
  • mismatch will trigger vibrotactile actuators in multiple locations on the flexor and extensor aspects across the wrist, the elbow, and the back (see below). Testing has shown that the mismatch is due to excessive coactivation, which can be reduced by location-specific vibrotactile stimulation.
  • trial-to-trial variability in the magnitude of the pLFR (Fig. 5B) will also be computed, independent of the scaling error, as the variance of pLFR divided by the weight (to remove the weight effect). A decrease in trial-to-trial variability will precede successful adaptation. Scaling error and trial-to-trial variability will be computed on the peak grip force rate (pGFR) for adaptation to texture over every 5 trials. Higher trial-to-trial variability is correlated with increased grip forces to increase the safety margin, which indicates difficulty with processing of sensory information from the object. An increase in trial-to-trial variability of 50% compared with that from the reference hand will trigger stimulation of vibrotactile actuators in the back of the vest to activate the lower trapezius muscle (Fig. 6) to increase stochastic noise in the system and enhance sensory processing.
  • finger torques are calculated.
  • the grip force (normal contact force) exerted by each finger, measured by the force sensors, will be used to map to the joint space in a computational hand/arm model, which will output the joint torque at each finger joint (each finger has 3 joints: the distal interphalangeal (DIP), proximal interphalangeal (PIP), and metacarpophalangeal (MCP) joints).
  • the kinetic redundancy of the two fingers due to the muscle-induced actuations in the closed loop will be resolved using an optimization algorithm. Normally the finger torques are greater at the PIP joint than at the MCP joint (Fig. 7).
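One conventional way to map a measured fingertip contact force to joint torques is the Jacobian transpose of a link model. The two-link planar finger below is a simplified stand-in for the three-joint hand model described above; the link lengths are illustrative assumptions.

```python
import math

def finger_joint_torques(force_n, q1, q2, l1=0.04, l2=0.03):
    """Static joint torques (tau = J^T * F) for a planar two-link finger.
    force_n: (fx, fy) contact force at the fingertip in newtons.
    q1, q2: joint angles in radians; l1, l2: link lengths in meters."""
    fx, fy = force_n
    # Partial derivatives of fingertip position (entries of the Jacobian).
    j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
    j12 = -l2 * math.sin(q1 + q2)
    j21 = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    j22 = l2 * math.cos(q1 + q2)
    tau1 = j11 * fx + j21 * fy   # proximal joint torque
    tau2 = j12 * fx + j22 * fy   # distal joint torque
    return tau1, tau2
```

With an extended finger and a force normal to it, the proximal torque exceeds the distal torque, consistent with the observation above that torques are normally greater at the proximal joint.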
  • it is determined whether a compensatory limb strategy is utilized.
  • Position sensors on the sleeve at the elbow and side of the vest will indicate whether the elbow is close to the trunk or away from it. If the elbow is away from the trunk and the muscle sensor over the lateral deltoid is activated, it indicates compensation by increasing the angle of the shoulder to orient the hand to grasp the object. This degree of compensation will correlate with excessive tilting of the object.
  • the user will have to learn to reduce the shoulder abduction angle by bringing the elbow position sensor and the position sensor on the side of the vest closer together. When the distance between the elbow and the vest is greater than that in the reference database, vibro-tactile actuators in both locations will vibrate simultaneously, and when the distance is reduced, they will stop vibrating, indicating that the correct position has been learned.
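A minimal sketch of the elbow-to-vest distance check driving the paired actuators; the 3-D positions and the reference distance are assumed inputs from the position sensors, and the field names are illustrative.

```python
def vibration_command(elbow_pos, vest_pos, reference_distance):
    """Vibrate both actuators while the elbow is farther from the trunk than
    the reference database allows; stop once the distance is reduced."""
    dx = elbow_pos[0] - vest_pos[0]
    dy = elbow_pos[1] - vest_pos[1]
    dz = elbow_pos[2] - vest_pos[2]
    distance = (dx * dx + dy * dy + dz * dz) ** 0.5
    too_far = distance > reference_distance
    return {"elbow_actuator": too_far, "vest_actuator": too_far}
```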
  • the data will be transmitted in real time to an electronic medical record, where it will be stored and/or displayed in the form of trial-to-trial data (graphs) or average data from a training session (plots and charts).
  • the data may be stored locally with the user, such as in memory associated with a user device.
  • a wearable feedback device may be a vibro-tactile feedback device embedded in one or more of a dorsal glove, cuff, sleeve, or vest. Snug-fitting bands, sleeves and vests, available in various sizes, will have pockets for placement of position sensors, vibro-tactile actuators and wireless transmitters in the specific locations indicated in Figure 8 and Figure 9.
  • the actuators will be located on the flexor and extensor aspect and will turn on or off based on the output of the software algorithms.
  • the actuators will communicate wirelessly with the electronic interface from the sensorized object and the gaming platform providing an interactive experience and feedback to aid in rehabilitation.
  • Certain implementations include methods to facilitate hand-object interactions and rehabilitation. It is believed that the systematic facilitation of controlled hand-object interactions assists in the formation of specific sensory-motor associations in neurologically impaired individuals.
  • the configuration of the training algorithms in conjunction with information from the various sensors will prevent the use of compensatory strategies such as gripping the object too tightly or tilting it too much by providing tactile stimulation in key areas. It will facilitate practice with one or both hands separately, alternately, or simultaneously.
  • Tasks and stimulus features can be programmed and presented to the individual using virtual reality and updated based on trial-to-trial performance. These methods can be used in conjunction with peripheral or central electric stimulation to reinforce new movement patterns.
  • the device may be utilized with a healthy individual for training rather than rehabilitation purposes.
  • Ideal placement of a hand, for example, can be modeled and feedback provided when the hand deviates from this model.
  • the placement of the hands during piano playing can be monitored to provide feedback on the physical orientation and placement of the hands that may not be evident merely from the notes being played.
  • FIG. 10 shows an exemplary block diagram of an exemplary embodiment of a system 100 according to the present disclosure.
  • Such processing/computing arrangement 110 can be, e.g., entirely or a part of, or include, but is not limited to, a computer/processor that can include, e.g., one or more microprocessors, and use instructions stored on a computer-accessible medium (e.g., RAM, ROM, hard drive, or other storage device).
  • a computer-accessible medium 120 (e.g., as described herein, a storage device such as a hard disk, floppy disk, memory stick, CD-ROM, RAM, ROM, etc., or a collection thereof) can be provided (e.g., in communication with the processing arrangement 110).
  • the computer-accessible medium 120 may be a non-transitory computer-accessible medium.
  • the computer-accessible medium 120 can contain executable instructions 130 thereon.
  • a storage arrangement 140 can be provided separately from the computer-accessible medium 120, which can provide the instructions to the processing arrangement 110 so as to configure the processing arrangement to execute certain exemplary procedures, processes and methods, as described herein, for example.
  • Figures 21A-C show the relationship between the slip ratio and the peak grip force rates for a wide range of textures. Note that the slip ratio is in a narrower range when using bare hands and when grasping the objects with a thin film of Tegaderm over the fingertips. The stronger correlation between the slip ratio and the peak grip force rate suggests that tactile sensibility was not impaired by Tegaderm. However, the foam coating clearly impaired tactile sensibility, as can be seen from the significantly weaker correlation.
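The strength of the slip-ratio versus peak-grip-force-rate relationship in Figures 21A-C can be quantified with a Pearson correlation coefficient, sketched here from first principles; the sequences passed in would come from the instrumented lifts.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A value near 1 (or -1) indicates the strong linear relationship seen with bare hands or Tegaderm; a value nearer 0 corresponds to the weaker correlation seen with the foam coating.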
  • the alternate hand practice strategy can restore adaptation of fingertip forces after stroke.
  • Adaptation with the affected hand is impaired after stroke, but it can be temporarily restored after prior practice with the unaffected hand.
  • the mechanisms underlying lack of adaptation of fingertip forces to object weight [measured by the difference in peak load force rate (pLFR) for two weights] were examined by considering muscle activity using surface and intramuscular (from FCU, ECRL & BRD muscles to avoid cross-talk) electrodes. Fourteen patients with post-stroke hemiparesis and age-matched controls lifted an instrumented grip object equipped with force sensors. On the first lift, the pLFR was not scaled to weight either in controls or in patients (Fig. 13A, light and heavy bars of similar height), suggesting that the association between object weight and fingertip forces had not yet been learned.
  • Training consisted of eight 45-minute sessions conducted twice a week for 4 weeks when patients grasped and lifted everyday objects first with their unaffected hand and then with the affected hand, in a 1:1 alternating manner, by isolating movement at the shoulder, elbow or wrist joints. Training was progressed to more difficult grasp orientations, using heavier or lighter objects, and combining grasp and lift with transport and placing movements to simulate real world tasks. The goal of training was to attain symmetrical grasping patterns with the two hands. Subjects grasped and lifted an instrumented grip device pre- and post- intervention.
  • Adaptation of fingertip forces to object weight was assessed by the difference in peak load force rates for the light and heavy weights (Fig. 16A, see also Fig. 13) and clearly improved post intervention.
  • Grasp execution was assessed with the preload phase duration (PLD) which is the time taken to stabilize grasp and reflects grip-load coordination.
  • the PLD has been found to be a robust and sensitive measure of grasp impairment post stroke that correlates with several tests of hand function. Although the patients still had considerable impairment, the PLD showed significant improvement post intervention in both the affected and unaffected hands (Fig. 16B). Subjects also showed clinical improvement in tactile sensibility, higher order sensory integration (stereognosis and 2-point discrimination), pinch strength, timing on fine motor tasks (8-13) of the Wolf Motor Function Test and quality of life measured with the Stroke Impact Scale (Table 1).
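The preload phase duration could be extracted from sampled grip and load force traces roughly as below; the contact and load-onset thresholds are illustrative assumptions, not values from the study.

```python
def preload_phase_duration(grip, load, dt, grip_thresh=0.1, load_thresh=0.1):
    """Time from first fingertip contact (grip force crossing grip_thresh)
    to the onset of vertical loading (load force crossing load_thresh).
    grip, load: force samples in newtons; dt: sample interval in seconds."""
    contact = next(i for i, g in enumerate(grip) if g > grip_thresh)
    load_on = next(i for i, f in enumerate(load) if f > load_thresh)
    return max(0.0, (load_on - contact) * dt)
```

A longer PLD reflects more time spent stabilizing the grasp before lifting, i.e. poorer grip-load coordination.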
  • System 100 may also include a display or output device, an input device such as a keyboard, mouse, touch screen or other input device, and may be connected to additional systems via a logical network.
  • Logical connections may include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation.
  • network computing environments can typically encompass many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like.
  • Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network.
  • program modules may be located in both local and remote memory storage devices.


Abstract

Treatment of neurological injury through motor relearning. A game-based sensorimotor rehabilitator that enables individuals to interact with functional objects using the appropriate amount of force, tilt, finger movement, and muscle activity to regain skill lost due to injury.

Description

GAME-BASED SENSORIMOTOR REHABILITATOR
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Application No. 61/824,258 filed May 16, 2013, which is incorporated herein by reference in its entirety.
BACKGROUND OF THE INVENTION
[0002] Even basic tasks such as feeding ourselves, manipulating tools, and performing activities of daily living require some skill. Using specialized tools, for example by machinists, welders, musicians, chefs and surgeons, requires even more skill, acquired through motor learning and extended practice. One main event that can impact the ability to perform these tasks is stroke.
[0003] There are over 8 million stroke survivors in the US. The majority of them do not have access to rehabilitation and have persistent hand dysfunction leading to chronic disability. Recovery of hand function after neurological injury such as stroke, cerebral palsy, multiple sclerosis, or spinal cord injury is extremely challenging. Recovery occurs through motor re-learning, during which specific sensory-motor associations are formed to shape hand posture to match that of the object and to scale fingertip forces to the weight and texture of objects. These associations need to be fine-tuned through practice and established in long-term procedural memory to regain skill. However, forming such task-specific memory requires flexible interaction with various types of objects in a systematic manner, appropriately rewarded for accuracy, that can be repeated without becoming tiresome. Furthermore, it is challenging to facilitate the formation of specific sensory-motor associations because individuals tend to use compensatory strategies, such as increasing the abduction angle at the shoulder and excessively co-activating the flexor and extensor muscles across a joint when attempting to complete the task. These compensatory strategies reinforce abnormal movements, which makes it more difficult to regain skill in the long term.
[0004] Motor learning has been shown to occur over multiple time-scales. At least three underlying processes are thought to contribute to learning: (1) error-based adaptation (fast process), (2) repetition that alters movement biases depending on what is repeated (slow process), and (3) reinforcement that occurs when error is reduced successfully and leads to savings, or faster re-learning, on subsequent attempts. Currently available interactive platforms do not facilitate real-time interaction with kinesthetic and haptic feedback in a controlled and paced manner for rehabilitation.
There is a need for systems and methods to enhance motor re-learning for restoration of hand function, especially after stroke.
SUMMARY OF THE INVENTION
[0005] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the following drawings and the detailed description.
[0006] In one implementation, the invention includes systems and methods to help restore dexterity and functional hand use in patients with neurologic impairment from various conditions, for example, stroke, cerebral palsy, spinal cord injury, or multiple sclerosis. The objective of this game-based physical therapy schema is to engage individuals using a biofeedback strategy that helps the patient's brain develop alternate neural pathways to overcome the damage produced by the stroke. The game will provide an in-home option to encourage the patient to perform therapy more frequently, to hasten improvement and minimize the stress associated with travel to therapy centers. The concept leverages current interest in computer games on small, inexpensive wireless communication devices together with sophisticated biomechanical algorithms to generate clinical metrics of performance and biofeedback. Data from the biomechanical analyses will be accessible to one or more providers and interfaced with the electronic medical record for their input and direction. The schema can also be adapted to generate training platforms for patients with other types of muscular deficits and for intact individuals.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.
[0008] Figure 1 : Schematic of the components of the device.
[0009] Figure 2: Parts used to create device components.
[0010] Figure 3: Sensorized game controller shaped like real objects that need to be manipulated; e.g. a coffee cup, a stick, cooking and eating implements, grooming tools, writing tools, adapted typing and musical keyboards, objects of various shapes, sizes, weights and textures.
[0011] Figure 4 is a graph illustrating peak load force rate linearly scaled to weight and activity in lifting muscles.
[0012] Figures 5A-5B illustrate the determination of scaling error for a given trial across weights (Figure 5A) and trial-to-trial variability (Figure 5B) to measure successful adaptation and savings.
[0013] Figure 6A is an image of a virtual human grasping a cylinder; Figure 6B is a schematic of a finger grasping a cylinder; Figure 6C is a schematic mapping the joints of a hand.
[0014] Figure 7 illustrates graphs of preliminary results with time profiles of measured forces from sensors (left) and torques at each finger joint (right).
[0015] Figure 8 illustrates a sensorized cuff and a sensorized sleeve.
[0016] Figure 9 illustrates a sensorized vest.
[0017] Figure 10 illustrates an embodiment of a computer system of the present invention.
[0018] Figure 11 illustrates anticipatory muscle activation as it reduces abnormal directional biases and increases grasp efficiency.
[0019] Figure 12 illustrates a graph showing error reduction rates, reflecting learning, with adaptation and repetition, with + indicating presence and - indicating absence.
[0020] Figures 13A-C illustrate congruent hand practice impact on adaptation restoration and muscle control in the affected hand post-stroke.
[0021] Figures 14A-B illustrate symptomatic pianists showing reduced activity in the upper and lower trapezius (LT) and increased activity in the finger extensor (EDC) at baseline compared with asymptomatic pianists. With LT activation, the slope (m) between EDC and LT in symptomatic pianists shifts closer to the asymptomatic group compared to baseline (b). Figure 14A illustrates normalized muscle activity; Figure 14B illustrates the extensor digitorum communis.
[0022] Figure 15 is a graph of the relationship between lower trapezius activity and grip force at lift during grasping with the affected hand post-stroke.
[0023] Figures 16A-16B illustrate graphs of alternate hand training improving the adaptation of fingertip forces to object weight (Figure 16A) and grasp execution (Figure 16B).
[0024] Figure 17 illustrates an implementation of a system with a computing device in communication with a controller and game device.
[0025] Figure 18 illustrates an implementation of a dorsal glove.
[0026] Figures 19A-19B show linear relationships between variables measuring grasp adaptation and object weight and texture.
[0027] Figures 20A-20B show similar differences in peak force rates for specified weight and texture pairs in healthy individuals.
[0028] Figures 21A-C show correlation between the Slip Ratio and Peak Grip Force Rate for a series of lifts of different textured objects performed with bare hands, a thin film of Tegaderm over the fingertips, and a coating of foam over the fingers.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0029] In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and made part of this disclosure.
[0030] The purpose of the present invention is to create a portable therapeutic platform to facilitate skill training/retraining anywhere, anytime using a novel game controller, such as a game-based sensorimotor rehabilitator, that serves as a virtual coach to provide skill training both in healthy and in neurologically impaired individuals using real-time tactile, kinesthetic and visual feedback. One implementation comprises three components: 1) the game controller, such as a cup or ball or other tool or implement, 2) a microcontroller, and 3) a computing device, such as a handheld/laptop/desktop computer running game-analysis software. In one implementation, illustrated in Fig. 1, sensor information 101 is provided from the game controller to a microcontroller 102, which may be on the game controller or separate. The microcontroller 102 determines the game controller's state, e.g. force being applied, orientation, acceleration, velocity, etc. The microcontroller 102 determines if the game controller's state is within a predetermined set of ranges for a particular application, for example the tilt of the game controller in a simulation involving a cup. The microcontroller 102 provides information to a computing device 103 and to a feedback device 104.
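The microcontroller's decision step described above can be sketched as a simple range check; the tilt range and the returned fields are assumptions for illustration, not values from the specification.

```python
def evaluate_state(tilt_deg, tilt_range=(-15.0, 15.0)):
    """Compare the controller's measured tilt against the allowed range and
    decide what to forward to the computing device and feedback device."""
    lo, hi = tilt_range
    in_range = lo <= tilt_deg <= hi
    return {
        "tilt_deg": tilt_deg,
        "in_range": in_range,
        "feedback_on": not in_range,  # actuators cue the user only on error
    }
```

In the cup simulation, for example, the actuators would stay silent while the virtual cup is held level and switch on when the tilt leaves the allowed range.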
[0031] In one aspect, an interactive gaming platform is implemented on a handheld, laptop or desktop computer system with web access. The system may include software-based portions and a hardware-in-the-loop interface through a real object. It is anticipated that such a platform will have wide applicability in homes, gyms, and rehabilitation centers for patients with neurologic impairment from various conditions, e.g., stroke, cerebral palsy, spinal cord injury, multiple sclerosis, etc. A successful rehabilitation outcome will restore dexterity and flexible functional hand use, with wide access to facilitate tele-diagnostics and tele-treatment through easy modification of gaming parameters. Certain implementations provide systems and methods to restore adaptation, facilitate grasp efficiency and normal directional biases during repetition, and enhance the rate of learning to improve hand function and quality of life post stroke.
[0032] Motor adaptation occurs when sensory information relevant to the task is extracted to form sensorimotor associations, which are used to predict accurate responses to similar actions in the future. Adaptation is the pivotal process in that it utilizes error feedback to identify the optimal movement for a task faster, which when reinforced through repetition can enhance learning for sustained changes in skill. It has been noted that patients are unable to adapt their fingertip forces and movements predictively to the expected consequences of an action with the affected hand post-stroke, perhaps because sensorimotor information from the affected side is inaccurate and/or its integration is disrupted.
[0033] However, somatosensory and visual information from each side of the body is processed bilaterally, and interlimb coordination is mediated by motor representations in the parietal and premotor areas shared by both limbs. Early in recovery after stroke, the undamaged hemisphere shows increased activation, but eventually normal sensorimotor lateralization is restored in the stroke-affected hemisphere. This suggests that redundant homologous pathways in the intact hemisphere facilitate reorganization within the affected hemisphere by mechanisms such as unmasking projections from the intact motor cortex to the cervical spinal cord, and axonal sprouting and formation of novel subcortical projections. In the chronic post-stroke period, independent finger movements of the recovered hand show a bilateral increase in regional cerebral blood flow in the dorsolateral and medial premotor areas, which are involved in motor planning. Disruption of activity in the dorsal premotor cortex of the intact hemisphere results in degraded behavior in the paretic hand. Thus, sensory information from each hand is represented bilaterally, providing redundant circuits that can be harnessed for recovery. These results suggest that actions from each hand are represented bilaterally and representations in the intact hemisphere can facilitate planning and control in the affected hand post stroke. Therefore, certain implementations utilize practice with the unaffected hand followed by practice with the affected hand to achieve adaptation of fingertip forces.
[0034] First, alternate hand training is provided to restore adaptation of fingertip forces in the affected hand. Skilled hand function requires that users grasp objects of various weights, textures, and shapes. It is believed that successful transfer of adaptation from the unaffected hand to the affected hand occurs readily when task- relevant sensory information is provided. This information may be kinesthetic, tactile or visual, and can be provided using alternative hand practice to form accurate sensorimotor associations. The understanding of how sensory modalities interact, given existing sensorimotor impairments, informs the structure of effective training protocols for adaptation.
[0035] Second, postural strategies are delineated to enhance alternate hand training for increased movement efficiency. Adaptation occurs by detecting error and then correcting for it predictively on subsequent practice. Short-latency spinal reflexes and intrinsic biomechanical properties of muscles contribute to reactive error correction mechanisms which adjust motor output for novel or unplanned movements. These reactive mechanisms are accentuated post stroke due to central disinhibition, leading to spasticity, synergistic movement patterns, inappropriate muscle co-activation, and use of excessive grip forces, all of which reduce the quality of movement-related sensory feedback and affect learning. However, error correction mechanisms also include supraspinal long-latency reflexes that take into account the consequences of the net torques on interconnected joints, and reflect an internal representation of the dynamics of the entire limb. Facilitation of these long-latency mechanisms can generate anticipatory postural responses that may reduce the need for reactive error correction mechanisms (Fig. 11). The affected hand 1101 of the individual in Figure 11 exhibits improved movement efficiency based upon the applied methods, which draw on the long-latency response in the lower trapezius observed in the unaffected hand's 1111 movement.
[0036] Stimulation of forearm and hand afferents has been shown to evoke long-latency reflexes in the lower trapezius muscle - a scapular adductor and anti-gravity upper-limb stabilizer. Voluntary activation of the trapezius further increases the amplitude of the long-latency reflex. Tasks that demand greater dexterity produce even larger bilateral responses in the trapezius and other upper limb stabilizers in healthy individuals, but this modulation is disrupted after stroke. Studies on post-stroke reaching have revealed abnormal joint coupling and muscle co-activation patterns that produce inefficient movements. However, these abnormal patterns are modifiable by the use of gravity loading and trunk restraint, suggesting that postural stability can affect the processing of peripheral movement-related proprioceptive information. For example, studies on skilled pianists show that voluntary activation of the lower trapezius muscle, assisted by biofeedback, leads to greater efficiency of finger movements during playing. In addition, recent preliminary data show that lower levels of postural muscle activation are associated with higher grip forces after stroke.
[0037] Spasticity and abnormal motor synergies make repetition challenging after stroke. It is believed that activation of anti-gravity upper limb stabilizing postural muscles (e.g. lower trapezius) is associated with increased movement efficiency. Thus, enhancing alternate hand practice with postural muscle activation may reduce abnormal post-stroke directional biases and facilitate repetition of more efficient movements. It is believed that triggering active postural strategies through lower trapezius stimulation will increase grasp efficiency. As further described below, certain implementations measure fingertip forces, arm compensation, and electromyographic (EMG) activity in arm and back muscles to assess directional biases.
[0038] Third, certain implementations determine and utilize learning rates across multiple time-scales and stages of skill learning for improvement in hand function post stroke. Adaptation of sensory-motor mappings is a fast learning process, which leads to a rapid reduction in movement error, and typically takes only a few trials; however it is easily forgotten. Repetition, on the other hand, is an error-independent process, and leads to a slow tuning of directional biases toward the repeated movement. For example, the rate of within-session and between-session learning across stages of skill learning with 'enhanced alternate hand training' can be examined and compared to training with the affected hand alone, using a novel task panel and structured training protocols. Successful motor learning requires a combination of fast and slow processes (Fig. 11). Appropriate sensory-motor mappings learned through adaptation must be reinforced over time through repetition across various stages of motor learning. Early learning (Stage 1) is thought to depend on attentional mechanisms. Attention to specific task features or sources of sensory information (e.g. kinesthetic, tactile and visual) at this stage may assist with learning. The next associative stage (Stage 2) is characterized by formation of visuomotor and sensorimotor associations through trial-and-error practice. At this stage, transfer of learning from one limb to the other occurs readily. Associations formed by using the unaffected hand can be used to improve the associations formed with the affected hand. In later learning (Stage 3) the movement is fine-tuned and becomes automatic. This stage of learning would require repeated practice with the affected hand which will lead to greater movement efficiency. Training algorithms are thus developed for each stage of learning.
[0039] Certain implementations provide a solution for re-learning of skilled hand function where the individual can learn at their own pace with guidance from the game- based sensorimotor rehabilitator (SMR). The systems and methods enable the individuals to learn to interact with functional objects using just the right amount of force, tilt, finger torques, and muscle activity to regain lost skill.
[0040] In one implementation, real objects are customized with force, orientation and acceleration sensors and the object is virtualized on screen. As the individual manipulates the object or objects, visual feedback is provided regarding the appropriateness of fingertip forces and orientation in a systematic manner to facilitate learning of the correct associations. The force and orientation information is also fed to computational biomechanical models and software algorithms to inform the thresholds for visual feedback. Further, in certain implementations, wearable position and muscle sensors embedded in a dorsal glove, cuff, sleeve or vest at key areas also send information regarding limb position to the computational models and algorithms. The output of the algorithms is sent to actuators located on the glove/cuff/sleeve/vest to provide tactile feedback to key areas in the same manner that a teacher or coach would apply a gentle touch to provide a cue to move in a certain manner. Tactile feedback will turn on or off when the individual moves within or outside the set parameters of the movement. Real time, trial-to-trial information from interaction with the object can be stored and communicated to the provider's electronic medical record for diagnosis, documentation of progress, and fine-tuning of stimulus parameters and gaming and feedback levels provided through a service provider. Various implementations utilize methods for how and when practice should be facilitated with one hand, or both hands separately, alternately, or simultaneously.
[0041] One implementation of a device for facilitating rehabilitation and training described above incorporates several components, as shown in the flow chart of Figure 1, using the parts shown in Figure 2. Figure 2 shows one implementation of a rehabilitation system 200. A microcontroller 201 is configured to receive sensor information from force sensors 211, which may be part of a game controller physically manipulated by the user. An inertial measurement unit (IMU) is in communication with the microcontroller to provide inertial information, such as gyroscopic, accelerometer, and magnetometer information. The microcontroller 201 is further in communication with a feedback mechanism 213. The microcontroller 201 may also utilize a wireless communication unit 203, such as to communicate with a second microprocessor 205. The microcontroller 201 is ultimately in communication with a computing device 207, such as a P.C. or laptop.
[0042] In one implementation, the game controller may have a form factor similar to a functional object, such as a coffee cup or ball. In a further implementation, the controller may consist of an object to be manipulated, such as a cup, and a wearable device for the user, such as a vest. The controllers are objects of various shapes and sizes equipped with force sensors, position/orientation sensors, inertial measurement units, and vibrational modules (Fig. 3). Figure 3 shows a cup-like game controller 301 having a cylindrical body. Force sensors 303, such as force sensitive resistors, are positioned to correspond with a user's hand. For example, the force sensors may be positioned at multiple locations to correlate with fingertip and palm placement on the game controller 301. A separate feedback mechanism 320 is provided, such as a strap or bracelet, to provide feedback regarding the position of the game controller 301.
[0043] Figure 8 illustrates another implementation where the system 800 includes a sleeve 810 as part of the game controller. The sleeve 810 includes position sensors 805 as well as a wireless transmitter 801. The sensors 805 may be positioned to indicate the positions of the upper arm and lower arm. Vibro-tactile actuators 803 are included on the sleeve 810, for example on the flexor and extensor portions of the arm. A cuff 830 may also be provided for further vibro-tactile feedback via vibro-tactile actuators 803.
[0044] Figure 9 illustrates another implementation where the system 900 includes a vest 910 as part of the game controller. The vest 910 includes position sensors 905 as well as a wireless transmitter 901. The sensors 905 may be positioned to indicate the position of the trunk. Vibro-tactile actuators 903 are included on the vest 910, for example corresponding to the lower trapezius and pectoralis muscle positions.
[0045] The force sensors are, in one implementation, oriented orthogonally to measure both grip and load forces, and the rate of change of these forces will be computed. Certain patient-specific game controllers include software to communicate, such as wirelessly, with a computer. The information from the sensorized game controller is also integrated with positional and muscle recruitment information from sensors sewn into a cuff/sleeve/vest worn by the individual in certain implementations. The controllers include one or more sensors and feedback systems. The controllers may be adapted for specific treatment regimens. For example, a controller may provide information to software on an associated computer regarding the grip force, movement speed, and movement direction of an object the user is interacting with. The software may then provide feedback to the user through the controller, such as via vibrotactile actuators.
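The rate of change of the grip and load forces can be estimated from sampled sensor readings by numerical differentiation. A hedged sketch in Python (the sampling format, function names, and the central-difference choice are assumptions, not details from the patent):

```python
def force_rates(samples, dt):
    """Central-difference estimate of force rate (N/s) from a
    uniformly sampled force trace; loses one sample at each end."""
    if len(samples) < 3:
        raise ValueError("need at least 3 samples")
    return [(samples[i + 1] - samples[i - 1]) / (2.0 * dt)
            for i in range(1, len(samples) - 1)]

def peak_force_rate(samples, dt):
    """Peak rate of change of force, e.g. the pGFR for a grip-force
    trace or the pLFR for a vertical load-force trace."""
    return max(force_rates(samples, dt))
```

The same routine serves both orthogonal channels, since only the input trace differs between grip force and load force.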
[0046] One implementation of the game controller comprises a cylinder similar in size to a coffee cup, instrumented with two pressure transducers positioned at the thumb and middle finger locations and a 3D spatial positioning transmitter. Foam or other tactile-deadening material may be used on the game controller to block tactile input to the user, such as to simulate an injury. The micro-computer interface includes circuitry to acquire force data from the force transducers and 3D position transducers. The controller communicates wirelessly with the game-based software on the laptop, which interacts with and controls game avatars representing a real-world object. The microcomputer interface and sensors are incorporated within the controller. The software development uses a 3D game engine's flexible input capacity to drive an interactive visualization of the patient's grip force with game-like reward structures and feedback systems.
[0047] As seen in Figure 17, the current prototype's game objective is to ask the individual, represented by an avatar, to squeeze the foam cylinder until the avatar can open his can of spinach. It will be appreciated that pop-culture or other such references can be used to provide simple clues for the associated functionality as well as familiarity for users. Figure 17 shows a simplified device layout of Figure 2, in which a game controller 1710 is in communication with a controller 1701 serving as an interface between the game controller 1710 and a computing device 1707 (shown as a laptop). The microcontroller's embedded data acquisition code continuously monitors the pressure transducer outputs and streams data via the RS-232C protocol to the laptop's game-analysis software that, in turn, controls the avatar and provides an interactive, engaging and fun experience for the patient. The laptop software communicates with the patient using visual and audio cues provided by the game concept. The microcontroller acquires analog voltages associated with the pressure transducer response and then transmits raw temporal force data, i.e. force changes over time for each transducer during the training session. Circuitry within the microcontroller a) acquires the analog voltage signals from the pressure transducers; b) performs signal filtering and analog-to-digital conversion; c) adds a time stamp to each data pair using the microcontroller's system clock and then transmits the data stream via RS-232C serial format to the laptop. This serial data stream is captured by subroutines within the laptop's game program that interpret the force and translate it into "Popeye squeezing the can".
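On the laptop side, the game subroutines must parse the timestamped data pairs captured from the serial link. The actual wire format is not specified in the text; the sketch below assumes one comma-separated record per line (timestamp in milliseconds, then one force value per transducer):

```python
def parse_force_stream(raw):
    """Parse 'timestamp_ms,thumb_force,finger_force' records captured
    from the RS-232C stream into (int, float, float) tuples.  The
    record layout here is an assumed example, not the patent's format."""
    records = []
    for line in raw.strip().splitlines():
        t, thumb, finger = line.split(",")
        records.append((int(t), float(thumb), float(finger)))
    return records
```

The resulting tuples can then feed the game avatar and the force-rate analytics directly.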
[0048] In another implementation, interaction with the interactive "cup controller" 1710 provides vibro-tactile feedback in addition to visual and auditory feedback. This version consists of an instrumented "cup controller" 1710 and a wrist band (not shown in Figure 17) with vibro-tactile actuators. The cup controller 1710 is virtualized on a screen of the computing device 1707 as a cup which needs to be moved in order to catch balls on the screen. As the patient picks up the cup controller 1710, the goal is to learn to use the right amount of finger force to grasp the cylinder and the right tilt. If they squeeze too hard the balls go flying out; if they tilt the cup too far the wristband provides directional vibro-tactile feedback: vibration on the right side of the wrist if the tilt is excessive to the right, and vibration on the left if the tilt is excessive to the other side.

[0049] To facilitate training in a systematic and controlled manner, one implementation relates to interactive training algorithms. The training algorithms are based on the principles of 'learning to learn', where learning simple sensorimotor associations within a low-dimensional task structure will lead to faster acquisition of similar associations for other tasks. The stepwise introduction of variability and complexity across the stages of learning will then lead to generalization of learning to novel tasks. The training will be delivered using instructions on the computerized display to the user based on input from the sensorized objects and the dorsal glove/cuff/sleeve/vest implementations. The instructions may cue practice with one or the other hand, and provide feedback and reward for correct performance.
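The directional wristband cue reduces to a signed-threshold check on the measured tilt. A sketch (the sign convention and tolerance value are assumptions for illustration):

```python
def tilt_feedback(tilt_deg, max_tilt_deg):
    """Return which side of the wristband should vibrate: 'right' for
    excessive rightward tilt, 'left' for excessive leftward tilt, or
    None when the cup is within tolerance."""
    if tilt_deg > max_tilt_deg:
        return "right"
    if tilt_deg < -max_tilt_deg:
        return "left"
    return None
```

The tilt itself would come from the IMU on the cup controller; the tolerance would be set per patient from the diagnostic trials.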
In one implementation, a sensorized mat is utilized and the training algorithm is adapted to the sensorized mat to facilitate standardized object placement in the work space and a structured progression of training from simple sensory-motor mappings to practice with more complex real-world objects, leading to increased overall skill. The training algorithms may be useful for training hand skill in other patient populations as well.
[0050] In one implementation, a video game platform is provided. A game controller is virtualized on screen and interfaced with game scenarios to provide visual feedback of the interaction with the object and to augment the feedback based on the output of the analytic algorithms. The game scenarios will provide interactive and enjoyable training. Games can be viewed on a variety of platforms, e.g. iPod, iPhone, iPad, laptops, and the controller itself.
[0051] One implementation utilizes analytic software. During typical operation, the game controller software will continuously monitor force transducer pressures and acceleration of the object from trial to trial from each hand separately for diagnostic information. For example, the user will be directed to modify the object by adding additional base plates to change its weight (w), or adding stickers to the region over the force transducers to change the texture (t) of the object-grasp interface. The user will also be asked to hold the object in various ways to gather information on the finger joint positions to compute estimated torques at the metacarpophalangeal (MCP) and interphalangeal (PIP) joints. Figures 6A-C illustrate a hand grasping a cylinder. The information obtained from these directed grasps will be used to set thresholds for visual and tactile feedback.
[0052] In one implementation, electronic inputs from position and muscle sensors are embedded in wearable modular garments comprising dorsal gloves/cuffs/sleeves and a vest at key locations to signal the position, orientation and activity at critical positions or muscle activity levels to provide information about the limb strategy used to manipulate the object. As a patient grasps the object, electronic inputs from the device will be integrated into computational hand/arm models and software algorithms that will (1) integrate information from all the activated sensors on the object and the wearable garment, (2) process the information as detailed below to compute performance parameters, (3) send outputs to actuators located on the glove/cuff/sleeve/vest to activate key areas to provide tactile feedback about how the movement is being performed, and (4) transmit the clinical performance data and analytics to physicians or providers to integrate with the user's electronic medical record.
[0053] In one implementation, a dorsal glove 1800, shown in Figure 18, is utilized. The dorsal glove 1800 rests on the top of the hand. It is fixed to the palmar surface using adjustable bands 1810. In one implementation, the adjustable bands are fixed over the creases of the proximal interphalangeal joints and distal interphalangeal joints. Sensors on the dorsal glove detect the finger joint positions and angles of the proximal interphalangeal, distal interphalangeal, and metacarpophalangeal joints. Vibrotactile actuators embedded on the glove over the knuckles can cue changes in position in response to specifically detected positions.
[0054] The information provided allows for computation of performance parameters. For example, orthogonally placed force transducers will measure the rates of change of the load force (vertical force) and grip force (normal force), which will be used to compute the scaling error, or the mismatch between the forces needed to grasp the object and those actually produced. Ideal performance metrics will be obtained from stored normative data of healthy individuals, or from "normal" performance of the unaffected limb.
[0055] The error definition for any trial is motivated by the ideal case of scaling where, given specific weights, a linear relationship is observed in the peak rate of change of the vertical load force (pLFR, Fig. 4). For a given trial (e.g. Trial 3 in Fig. 5), the N weights in ascending order are w1, w2, ..., wN with corresponding peak load force rates F1, F2, ..., FN. Taking a weight wn and its associated pLFR Fn as the reference pair, one can determine the ideal peak load force rate for each other weight wk as F̂k = (wk/wn)·Fn, and the mean sum-squared error as en = (1/N) Σk (Fk − F̂k)², where the subscript in en indicates that the nth weight-pLFR pair is used as the reference for the calculation. The scaling error of a given trial j can then be calculated by averaging en over the different reference pairs, Ej = (1/N) Σn en, where dividing by a normalizing factor allows comparison of the scaling error between sessions when different weights are applied. Within a session, the reduction rate of the scaling error Ej across J trials will be a measure of "successful" adaptation. The scaling error will first be computed during initial diagnostic trials prior to training. The aim of the training session will be to reduce the mismatch in scaling error between the normative data and the real-time data, and/or between the "unaffected hand" and the "affected hand", which will serve as the reference hand data. The peak LFR or peak GFR at each training trial will be compared with the average error for the given weight of the object from the "normal" trials. A mismatch of >50% will signal augmentation of visual feedback on the level of grip force and prompt the individual to practice with the unaffected hand before practice with the affected hand until the mismatch is reduced to <10%. If there is no unaffected hand to serve as the reference hand, mismatch will trigger vibrotactile actuators in multiple locations on the flexor and extensor aspects across the wrist, the elbow and the back (see below).
Testing has shown that the mismatch is due to excessive coactivation which can be reduced by location specific vibrotactile stimulation.
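The scaling-error computation can be sketched directly from the definitions in paragraph [0055]. The normalizing factor is not legible in the source, so the sketch below normalizes by the squared mean pLFR; that choice is an assumption, made only so the error is dimensionless and comparable across sessions:

```python
def scaling_error(weights, plfrs):
    """Scaling error for one trial: average, over all reference pairs
    (w_n, F_n), of the mean sum-squared mismatch between observed peak
    load force rates F_k and the ideal linear scaling (w_k / w_n) * F_n."""
    n = len(weights)
    if n == 0 or n != len(plfrs):
        raise ValueError("weights and pLFRs must be equal-length, non-empty")
    total = 0.0
    for w_ref, f_ref in zip(weights, plfrs):
        ideal = [f_ref * w_k / w_ref for w_k in weights]  # F_hat_k
        total += sum((f - fh) ** 2 for f, fh in zip(plfrs, ideal)) / n
    mean_plfr = sum(plfrs) / n
    # Normalization by the squared mean pLFR is an assumed stand-in for
    # the (illegible) normalizing factor in the source text.
    return (total / n) / (mean_plfr ** 2)
```

A perfectly scaled trial (pLFR proportional to weight) yields zero error; any departure from linearity raises it.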
[0056] For certain implementations, trial-to-trial variability in the magnitude of the pLFR (Fig. 5B) will also be computed independent of the scaling error, as the variance of the pLFR divided by the weight (to remove the weight effect). A decrease in trial-to-trial variability will precede successful adaptation. Scaling error and trial-to-trial variability will be computed on the peak grip force rate (pGFR) for adaptation to texture over every 5 trials. Higher trial-to-trial variability is correlated with increased grip forces to increase the safety margin, which indicates difficulty with processing of sensory information from the object. An increase in trial-to-trial variability of 50% compared with that from the reference hand will trigger stimulation of vibrotactile actuators in the back of the vest to activate the lower trapezius muscle (Fig. 6) to increase stochastic noise in the system and enhance sensory processing.
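The variability measure above is a weight-normalized variance over a short window of trials. A sketch (population variance is assumed; the patent does not specify the estimator):

```python
def trial_to_trial_variability(plfrs, weight):
    """Variance of the peak load force rate across trials, divided by
    the object weight to remove the weight effect."""
    n = len(plfrs)
    if n < 2 or weight <= 0:
        raise ValueError("need >= 2 trials and a positive weight")
    mean = sum(plfrs) / n
    variance = sum((f - mean) ** 2 for f in plfrs) / n
    return variance / weight
```

The same function applies to pGFR windows when assessing adaptation to texture.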
[0057] In one implementation, finger torques are calculated. The grip force (normal contact force) exerted by each finger, measured by the force sensors, will be used to map to the joint space in a computational hand/arm model, which will output the joint torque at each finger joint (each finger has 3 joints: the distal interphalangeal (DIP), proximal interphalangeal (PIP), and metacarpophalangeal (MCP) joints). The kinetic redundancy of the two fingers due to the muscle-induced actuations in the closed loop will be resolved using an optimization algorithm. Normally the finger torques are greater at the PIP joint than at the MCP joint (Fig. 7). Greater finger torques in the MCP joints of the fingers compared to the PIP joints signal an inefficient grasp and will trigger vibrotactile actuators in a cuff at the front of the wrist joint to facilitate relaxation of the muscle and a change in finger position. Success will be measured by reduction in the error between the MCP and PIP joint torques produced by the training hand compared with the same from the reference hand.
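The full hand/arm model and its optimization are beyond a short sketch, but the feedback rule and success metric built on the computed torques are simple comparisons. A hedged sketch (function name and error metric are illustrative):

```python
def grasp_feedback(mcp_torque, pip_torque, ref_mcp, ref_pip):
    """Return (trigger, error): trigger the wrist-cuff actuators when
    torque at the MCP joint exceeds that at the PIP joint (an
    inefficient grasp); error is the mismatch of both joint torques
    against the reference hand, here a simple absolute-difference sum."""
    trigger = mcp_torque > pip_torque
    error = abs(mcp_torque - ref_mcp) + abs(pip_torque - ref_pip)
    return trigger, error
```

Training success would then show as the error term shrinking across sessions while the trigger stays off.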
[0058] In one implementation, a compensatory limb strategy is detected and corrected. Position sensors on the sleeve at the elbow and on the side of the vest will indicate whether the elbow is close to the trunk or away from it. If the elbow is away from the trunk and the muscle sensor over the lateral deltoid is activated, it indicates compensation by increasing the angle of the shoulder to orient the hand to grasp the object. This degree of compensation will correlate with excessive tilting of the object. In order to correct the tilt, the user will have to learn to reduce the shoulder abduction angle by bringing the elbow position sensor and the position sensor on the side of the vest closer together. When the distance between the elbow and the vest is greater than that in the reference database, vibro-tactile actuators in both locations will vibrate simultaneously, and when the distance is reduced, they will stop vibrating, indicating that the correct position has been learned.
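The elbow-to-trunk rule above can be sketched as a distance threshold on the two position sensors (3D coordinates and a scalar reference distance are assumptions about the sensor output):

```python
import math

def compensation_active(elbow_xyz, vest_xyz, ref_distance):
    """Vibrate the elbow and vest actuators while the elbow sits
    farther from the trunk than in the reference database; feedback
    stops once the distance is reduced below the reference."""
    return math.dist(elbow_xyz, vest_xyz) > ref_distance
```

Each new sensor sample would re-evaluate this predicate, turning both actuators on or off together.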
[0059] It should be appreciated that learning across various sessions and stages of learning will be determined by (1) the savings in the time course of error reduction for adaptation to object weight and texture, (2) the change in the location of the finger torque from the MCP joint to the PIP joints, which is necessary for precision grasp, and (3) the reduction in compensation of the limb position.
[0060] In one implementation, the data will be stored and/or displayed in real time to an electronic medical record in the form of trial-to-trial data (graphs), or average data from a training session (in the form of plots and charts). Alternatively, the data may be stored locally with the user, such as in memory associated with a user device.
[0061] While the individual interfaces directly with the objects through a novel interactive gaming environment, clinicians can monitor performance measures, which will be used for both immediate expert feedback and patient training records. This will enable analysis of the training session online or offline.
[0062] In one implementation, a wearable feedback device is provided. The wearable feedback device may be a vibro-tactile feedback device embedded in one or more of a dorsal glove, cuff, sleeve, or vest. Snug-fitting bands, sleeves and vests, available in various sizes, will have pockets for placement of position sensors, vibro-tactile actuators and wireless transmitters in the specific locations indicated in Figure 8 and Figure 9. The actuators will be located on the flexor and extensor aspects and will turn on or off based on the output of the software algorithms. In one implementation, the actuators will communicate wirelessly with the electronic interface from the sensorized object and the gaming platform, providing an interactive experience and feedback to aid in rehabilitation.
[0063] Certain implementations include methods to facilitate hand-object interactions and rehabilitation. It is believed that the systematic facilitation of controlled hand-object interactions assists in the formation of specific sensory-motor associations in neurologically impaired individuals. The configuration of the training algorithms in conjunction with information from the various sensors will prevent the use of compensatory strategies such as gripping the object too tightly or tilting it too much by providing tactile stimulation in key areas. It will facilitate practice with one or both hands separately, alternately, or simultaneously. Tasks and stimulus features can be programmed and presented to the individual using virtual reality and updated based on trial-to-trial performance. These methods can be used in conjunction with peripheral or central electric stimulation to reinforce new movement patterns.
[0064] In one application, the device may be utilized with a healthy individual for training rather than rehabilitation purposes. Ideal placement of a hand, for example, can be modeled and feedback provided when the hand varies from this model. As one example, placement of the hands during piano playing can be monitored to provide feedback on physical orientation and placement of the hands that may not be evident merely from the notes being played.
[0065] In one embodiment, shown in Fig. 10, a system 100 is provided. Figure 10 shows an exemplary block diagram of an exemplary embodiment of a system 100 according to the present disclosure. For example, an exemplary procedure in accordance with the present disclosure can be performed by a processing arrangement 110 and/or a computing arrangement 110. Such processing/computing arrangement 110 can be, e.g., entirely or a part of, or include, but not limited to, a computer/processor that can include, e.g., one or more microprocessors, and use instructions stored on a computer-accessible medium (e.g., RAM, ROM, hard drive, or other storage device).
[0066] As shown in Figure 10, e.g., a computer-accessible medium 120 (e.g., as described herein, a storage device such as a hard disk, floppy disk, memory stick, CD-ROM, RAM, ROM, etc., or a collection thereof) can be provided (e.g., in communication with the processing arrangement 110). The computer-accessible medium 120 may be a non-transitory computer-accessible medium. The computer-accessible medium 120 can contain executable instructions 130 thereon. In addition or alternatively, a storage arrangement 140 can be provided separately from the computer-accessible medium 120, which can provide the instructions to the processing arrangement 110 so as to configure the processing arrangement to execute certain exemplary procedures, processes and methods, as described herein, for example.
EXPERIMENTAL RESULTS
[0067] The co-ordination of fingertip motion and forces during object manipulation developed by Johansson et al. has served as a model system of sensorimotor integration for more than 30 years, and is a sensitive test of fine motor control in various patient populations. Experiments were built from this work to understand the mechanisms of hand motor impairment after stroke and to design physiologically-based rehabilitation approaches. The following experiments test the 'alternate hand training strategy', which involves practice with the unaffected hand and then the affected hand, to facilitate adaptation, repetition and relearning to restore hand function in stroke patients. The first step is to determine how individuals adapt their grasp under various sensory constraints, i.e. the presence or absence of the kinesthetic, visual and tactile sensory modalities. Then one can test the conditions under which the alternate hand training strategy restores adaptation or provides a benchmark.

Training Methods
[0068] A foam coating for the fingers effectively impairs tactile sensibility and eliminates 2-point discrimination in healthy individuals. One experimental protocol was fully developed as shown below (Table 1).
Condition  Kinesthetic  Vision  Tactile  Experimental Conditions
1          1            1       1        full vision
2          1            0       1        vision occluded
3          0            1       1        no lift
4          0            0       1        vision occluded + no lift
5          1            1       0        foam fingers
6          1            0       0        vision occluded + foam fingers
7          0            1       0        no lift + foam fingers
8          0            0       0        no lift + vision occluded + foam fingers

Table 1. Experimental protocol for Aim 1
[0069] Establish linearity between variables measuring grasp adaptation and object weight and texture, and select weight and texture pairs. Figures 19A-B show a linear relationship between the peak load force rate and object weight for a wide range of weights (n=10), and between the peak grip force rate and the friction at the grip surface (measured by the coefficient of friction, COF) for a wide range of textures (n=14). Equivalent weight and texture pairs were selected and their presentation randomized in the experimental protocol. Figures 20A-B show that the selected weight and texture pairs show relatively similar differences in peak force rates.
[0070] Minimize the noise in grip force data due to changes in humidity and temperature. Grip forces are extremely sensitive to small changes in humidity and temperature. The ridges on the fingertips serve to increase the coefficient of friction at the grip surface, which then requires smaller grip forces to grasp the same object. This poses a challenge when subjects are performing repeated lifts over a period of time, even when room temperature and humidity are controlled. Therefore a thin film (e.g. Tegaderm) may be applied over the fingertips, which does not impair sensibility. Tactile sensibility can be measured by examining the correlation between the coefficient of friction and the grip force rate (as shown in Figs. 19A-B). The coefficient of friction can be determined for each individual by computing the slip ratio [1/(ratio of grip force to load force)]. Figures 21A-C show the relationship between the slip ratio and the peak grip force rates for a wide range of textures. Note that the slip ratio is in a narrower range when using bare hands and when grasping the objects with a thin film of Tegaderm over the fingertips. The stronger correlation between the slip ratio and the peak grip force rate suggests that tactile sensibility was not impaired by Tegaderm. However, the foam coating clearly impaired tactile sensibility, as can be seen from the significantly weaker correlation.
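The slip ratio defined above, 1/(grip force/load force), reduces to the load/grip force ratio. A sketch:

```python
def slip_ratio(grip_force, load_force):
    """Slip ratio = 1 / (grip force / load force), used to estimate
    each individual's coefficient of friction at the grip surface."""
    if grip_force <= 0:
        raise ValueError("grip force must be positive")
    return load_force / grip_force
```

A narrower per-subject spread of this ratio across textures, as seen with bare hands or Tegaderm, indicates intact tactile sensibility.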
Preliminary Results:
1. The alternate hand practice strategy can restore adaptation of fingertip forces after stroke.
[0071] Adaptation with the affected hand is impaired after stroke, but it can be temporarily restored after prior practice with the unaffected hand. The mechanisms underlying the lack of adaptation of fingertip forces to object weight [measured by the difference in peak load force rate (pLFR) for two weights] were examined by considering muscle activity using surface and intramuscular (from the FCU, ECRL & BRD muscles to avoid cross-talk) electrodes. 14 patients with post-stroke hemiparesis and age-matched controls lifted an instrumented grip object equipped with force sensors. On the first lift, the pLFR was not scaled to weight either in controls or in patients (Fig. 13A, light and heavy bars of similar height), suggesting that the association between object weight and fingertip forces had not yet been learned. However, by the 5th lift, controls clearly scaled the pLFR, showing evidence of adaptation, but patients did not. Both patients and controls lifted a range of weights in this manner. The pLFR was then correlated with muscle activity for the 5th lift. In controls, the pLFR was highly correlated only with activity in the lifting muscle, the anterior deltoid (aDEL) (Fig. 13B), since the object was lifted by flexing the shoulder. In contrast, patients showed high correlation with activity in many opposing muscles. When patients first practiced lifting the grip object with their unaffected hand (5 trials) and then with their affected hand (alternate hand strategy), they were able to scale the pLFR on the first trial with the affected hand, with corresponding restoration of selective muscle activation patterns as in controls (Fig. 13C). This occurred, however, only when the same action (congruent action) was performed with each hand. These results suggest that successful adaptation to object weight with the affected hand requires accurate kinesthetic information from lifting actions. However, when subjects lifted familiar objects of different weights (e.g.
water bottle, soda can) that provided additional tactile and visual information regarding their weight using the alternate hand strategy, neither controls nor patients transferred the ability to scale their pLFR on the first lift with the affected hand. Patients and controls used congruent actions with both hands and were explicitly told that the weights of the objects were the same for each hand. The results suggest that competition among multiple sensory contexts can interfere with the acquisition of accurate internal representations for adaptation to one sensory context. It is believed that training protocols for sensorimotor adaptation must initially be structured within single sensory contexts for successful learning. The above results suggest that restoration of adaptation requires accurate modality-specific sensory information, which can be obtained from the unaffected hand.
2. The role of postural muscle activation on movement efficiency
[0072] To understand the role of postural muscle activation on movement efficiency, 31 expert pianists with and without symptoms of overuse injury were studied. Surface EMG was recorded from 14 upper limb muscles while they played octaves at baseline and with biofeedback-assisted activation of the lower trapezius (LT). Symptomatic pianists (n=11) showed reduced activation in the upper and lower trapezius and overactivation in their finger extensor muscle (EDC) at baseline compared to asymptomatic pianists (n=20) (Fig. 14A). The slope (m) between the EDC and LT (each color cluster represents trials from one subject) was markedly positive in symptomatic pianists, in contrast to a negative slope in the asymptomatic group, suggesting an inverse relationship between the EDC and LT muscles. Voluntary activation of the LT muscle shifted the slope in the symptomatic group closer to that of the asymptomatic group compared to baseline (Fig. 14B, see starred plot), suggesting reduced activity in the EDC muscle and greater efficiency of finger movements. Thus, activation of anti-gravity postural muscles, i.e. the lower trapezius, can improve movement efficiency.
[0073] Based on the above results, a further experiment sought to determine whether a relationship exists between lower trapezius activity and excessive grasping forces. Patients with stroke have been shown to produce excessive grip forces in both the affected and unaffected hands, leading to grasp inefficiency. As described below in Section 3, the subjects (each color cluster represents trials from one subject) who showed greater activity in the lower trapezius produced lower grip forces at lift (Fig. 15), i.e. just enough force to hold the object, but not so much as to squeeze it, suggesting greater grasp efficiency. No intervention was applied to voluntarily activate the lower trapezius. It is believed that grip force magnitude at lift is a meaningful measure of grasp efficiency that may be altered by changes in postural muscle activity.
3. Feasibility of alternate hand training to improve hand function post stroke.
[0074] Six subjects with post-stroke hemiparesis participated in a 4-week alternate hand training intervention. Training consisted of eight 45-minute sessions conducted twice a week for 4 weeks, in which patients grasped and lifted everyday objects first with their unaffected hand and then with the affected hand, in a 1:1 alternating manner, by isolating movement at the shoulder, elbow or wrist joints. Training progressed to more difficult grasp orientations, using heavier or lighter objects, and combining grasp and lift with transport and placing movements to simulate real-world tasks. The goal of training was to attain symmetrical grasping patterns with the two hands. Subjects grasped and lifted an instrumented grip device pre- and post-intervention. Adaptation of fingertip forces to object weight was assessed by the difference in peak load force rates for the light and heavy weights (Fig. 16A, see also Fig. 13) and clearly improved post-intervention. Grasp execution was assessed with the preload phase duration (PLD), which is the time taken to stabilize grasp and reflects grip-load coordination. The PLD has been found to be a robust and sensitive measure of grasp impairment post stroke that correlates with several tests of hand function. Although the patients still had considerable impairment, the PLD showed significant improvement post-intervention in both the affected and unaffected hands (Fig. 16B). Subjects also showed clinical improvement in tactile sensibility, higher-order sensory integration (stereognosis and 2-point discrimination), pinch strength, timing on fine motor tasks (8-13) of the Wolf Motor Function Test, and quality of life measured with the Stroke Impact Scale (Table 1).
[0075] The above results demonstrate the feasibility of alternate hand training for clinically relevant improvement in hand function.

[0076] System 100 may also include a display or output device, an input device such as a keyboard, mouse, touch screen or other input device, and may be connected to additional systems via a logical network. Many of the embodiments described herein may be practiced in a networked environment using logical connections to one or more remote computers having processors. Logical connections may include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet and may use a wide variety of different communication protocols. Those skilled in the art can appreciate that such network computing environments can typically encompass many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
[0077] Various embodiments are described in the general context of method steps, which may be implemented in one embodiment by a program product including computer-executable instructions, such as program code, executed by computers in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps. [0078] Software and web implementations of the present invention could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various database searching steps, correlation steps, comparison steps, and decision steps. It should also be noted that the words "component" and "module," as used herein and in the claims, are intended to encompass implementations using one or more lines of software code, and/or hardware implementations, and/or equipment for receiving manual inputs.
[0079] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for the sake of clarity.
[0080] The foregoing description of illustrative embodiments has been presented for purposes of illustration and of description. It is not intended to be exhaustive or limiting with respect to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosed embodiments. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.

Claims

WHAT IS CLAIMED:
1. A system for providing therapy or training, comprising:
a wearable apparatus comprising a sensor;
a vibro-tactile actuator; and
a wireless transmitter;
wherein the vibro-tactile actuator is positioned within the wearable apparatus to be adjacent to a muscle of interest for providing therapy.
2. The system of claim 1, wherein the sensor includes a first proximity sensor and a second proximity sensor, the first proximity sensor associated with a limb portion of the wearable apparatus and the second proximity sensor associated with a trunk portion of the wearable apparatus, wherein the wearable apparatus is configured to provide an indication of when and how far the limb is away from the trunk.
3. The system of claim 1, wherein the vibro-tactile actuator is associated with a shoulder portion of the wearable apparatus and is configured to provide vibro-tactile feedback when the first proximity sensor and second proximity sensor indicate the elbow is away from the trunk.
4. The system of claim 1, further comprising a game controller.
5. The system of claim 4, wherein the game controller comprises a game controller sensor.
6. The system of claim 5, wherein the game controller sensor comprises a force sensor and an orientation sensor.
7. The system of claim 5, wherein the wearable apparatus is in communication with the game controller.
8. The system of claim 7, further comprising a microcontroller in communication with the game controller and the wearable apparatus.
9. The system of claim 7, further comprising a computing device in communication with the microcontroller.
10. The system of claim 4, wherein the game controller comprises:
a housing having a shape;
a first force sensitive resistor and a second force sensitive resistor positioned on opposite sides of the housing;
an accelerometer;
a gyrometer; and
a wireless transmitter.
11. The system of claim 10, wherein the game controller is a glove and further wherein the housing is a flexible material.
12. The system of claim 11, further comprising a vibro-tactile actuator.
13. A computer-implemented machine for training or rehabilitating, comprising:
a game controller having one or more sensors;
a feedback mechanism;
a processor; and
a tangible computer-readable medium operatively connected to the processor and including computer code configured to:
receive sensor information from the game controller;
determine the game controller's state; and
determine if the state is within an identified range of acceptable states and, if not, send information to the feedback mechanism to provide corrective feedback.
14. The computer-implemented machine of claim 13, wherein the one or more sensors include a force sensor.
15. The computer-implemented machine of claim 13, wherein determining if the state is within an identified range of acceptable states further includes using modality-specific information from an unaffected body part.
16. A method of treating neurological injury comprising:
training an affected hand using modality-specific information from an unaffected body part which will stimulate activation of similar muscle groups on the affected body part.
17. The method of claim 16 further comprising calculating a scaling error associated with the difference between the force needed to grasp an object and the force actually used by the affected hand to grasp the object.
18. The method of claim 17, wherein if the scaling error is greater than 50%, providing a signal indicating practice is to be performed with the unaffected hand.
19. The method of claim 16, further comprising calculating a trial-to-trial variability in the magnitude of a load force for the affected hand and for the unaffected hand.
20. The method of claim 19, further comprising determining if trial-to-trial variability is greater than 50% more for the affected hand than the unaffected hand and, if so, triggering stimulation of a feedback mechanism indicating activation of a stabilizing muscle.
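Claims 13 through 20 together describe a feedback loop: compare grasp performance between the two hands and trigger corrective cues when it falls outside an acceptable range. The sketch below illustrates only the two explicit 50% thresholds (claims 18 and 20); everything else is an assumption introduced here, including the function names, the coefficient-of-variation measure of trial-to-trial variability, the use of the most recent trial for the scaling error, and the cue strings.

```python
def scaling_error(required_force, actual_force):
    """Relative difference between the force needed to grasp an object
    and the force actually used by the affected hand (claim 17)."""
    return abs(actual_force - required_force) / required_force

def variability(load_forces):
    """Trial-to-trial variability of load-force magnitude, taken here as a
    coefficient of variation (an assumed reading of claim 19)."""
    mean = sum(load_forces) / len(load_forces)
    sd = (sum((f - mean) ** 2 for f in load_forces) / len(load_forces)) ** 0.5
    return sd / mean

def corrective_feedback(required_force, affected_trials, unaffected_trials):
    """Apply the two 50% rules of claims 18 and 20 to recorded load forces."""
    cues = []
    # Claim 18: if the scaling error exceeds 50%, signal that practice is to
    # be performed with the unaffected hand.  The most recent affected-hand
    # trial is used here, which is an assumption.
    if scaling_error(required_force, affected_trials[-1]) > 0.5:
        cues.append("practice_with_unaffected_hand")
    # Claim 20: if affected-hand variability is more than 50% greater than
    # unaffected-hand variability, trigger feedback indicating activation
    # of a stabilizing muscle.
    if variability(affected_trials) > 1.5 * variability(unaffected_trials):
        cues.append("stimulate_stabilizing_muscle")
    return cues
```

For example, a patient who needs 10 N but produces 3 N on the latest trial, with much noisier load forces on the affected hand than the unaffected hand, would receive both cues.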
PCT/US2014/038124 2013-05-16 2014-05-15 Game-based sensorimotor rehabilitator Ceased WO2014186537A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP14798022.1A EP2996551A4 (en) 2013-05-16 2014-05-15 Game-based sensorimotor rehabilitator
US14/942,971 US10299738B2 (en) 2013-05-16 2015-11-16 Game-based sensorimotor rehabilitator

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361824258P 2013-05-16 2013-05-16
US61/824,258 2013-05-16

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/942,971 Continuation-In-Part US10299738B2 (en) 2013-05-16 2015-11-16 Game-based sensorimotor rehabilitator

Publications (1)

Publication Number Publication Date
WO2014186537A1 true WO2014186537A1 (en) 2014-11-20

Family

ID=51898859

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/038124 Ceased WO2014186537A1 (en) 2013-05-16 2014-05-15 Game-based sensorimotor rehabilitator

Country Status (2)

Country Link
EP (1) EP2996551A4 (en)
WO (1) WO2014186537A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105700689A (en) * 2016-03-17 2016-06-22 北京工业大学 Personalized MI-EEG training and collecting method based on mirror image virtualization and Skinner reinforced learning
WO2016184935A3 (en) * 2015-05-19 2016-12-29 Universite Paris Descartes Method for evaluating manual dexterity
WO2017147652A1 (en) * 2016-02-29 2017-09-08 Trifecta Brace Pty Limited Psycho-social methods and apparatus for: rehabilitation, pre-operatively and post-operatively to orthopaedic surgery
WO2018043181A1 (en) * 2016-08-31 2018-03-08 Nihon Kohden Corporation Peg for rehabilitation
EP3248581A4 (en) * 2015-01-23 2018-10-10 Neofect Co. Ltd. Hand rehabilitation exercise system and method
US10493323B2 (en) 2017-02-23 2019-12-03 Elwha Llc Personal therapy and exercise monitoring and oversight devices, systems, and related methods
WO2022139772A1 (en) 2020-12-25 2022-06-30 Istanbul Medipol Universitesi Teknoloji Transfer Ofisi Anonim Sirketi Wearable robotic device
WO2022169949A1 (en) * 2021-02-03 2022-08-11 Ohio State Innovation Foundation Systems for collaborative interaction using wearable technology
EP3986266A4 (en) * 2019-06-21 2023-10-04 Rehabilitation Institute of Chicago D/b/a Shirley Ryan Abilitylab WEARABLE JOINT TRACKING DEVICE RELATED TO MUSCLE ACTIVITY AND ASSOCIATED METHODS

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6026322A (en) * 1991-08-07 2000-02-15 Ultramind International Limited Biofeedback apparatus for use in therapy
WO2007138598A2 (en) * 2006-06-01 2007-12-06 Tylerton International Inc. Brain stimulation and rehabilitation
US20090054067A1 (en) * 2007-08-23 2009-02-26 Telefonaktiebolaget Lm Ericsson (Publ) System and method for gesture-based command and control of targets in wireless network
WO2009050715A2 (en) * 2007-10-18 2009-04-23 Gravitx Ltd Means and method for interaction between real physical maneuvers and actions in a virtual environment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0799597B1 (en) * 1996-03-19 2000-11-22 Balance International Inc. Balance prosthesis apparatus and method
US8311623B2 (en) * 2006-04-15 2012-11-13 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for estimating surface electromyography
US8475172B2 (en) * 2007-07-19 2013-07-02 Massachusetts Institute Of Technology Motor learning and rehabilitation using tactile feedback
CN101983090A (en) * 2008-04-03 2011-03-02 韩国电子通信研究院 Training apparatus and method based on motion content
WO2010085476A1 (en) * 2009-01-20 2010-07-29 Northeastern University Multi-user smartglove for virtual environment-based rehabilitation
EP2552304B1 (en) * 2010-03-31 2015-09-23 Agency For Science, Technology And Research Brain-computer interface system and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6026322A (en) * 1991-08-07 2000-02-15 Ultramind International Limited Biofeedback apparatus for use in therapy
WO2007138598A2 (en) * 2006-06-01 2007-12-06 Tylerton International Inc. Brain stimulation and rehabilitation
US20090054067A1 (en) * 2007-08-23 2009-02-26 Telefonaktiebolaget Lm Ericsson (Publ) System and method for gesture-based command and control of targets in wireless network
WO2009050715A2 (en) * 2007-10-18 2009-04-23 Gravitx Ltd Means and method for interaction between real physical maneuvers and actions in a virtual environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2996551A4 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3248581A4 (en) * 2015-01-23 2018-10-10 Neofect Co. Ltd. Hand rehabilitation exercise system and method
WO2016184935A3 (en) * 2015-05-19 2016-12-29 Universite Paris Descartes Method for evaluating manual dexterity
US20190380625A1 (en) * 2015-05-19 2019-12-19 Universite Paris Descartes Method for evaluating manual dexterity
WO2017147652A1 (en) * 2016-02-29 2017-09-08 Trifecta Brace Pty Limited Psycho-social methods and apparatus for: rehabilitation, pre-operatively and post-operatively to orthopaedic surgery
CN105700689A (en) * 2016-03-17 2016-06-22 北京工业大学 Personalized MI-EEG training and collecting method based on mirror image virtualization and Skinner reinforced learning
CN105700689B (en) * 2016-03-17 2018-07-13 北京工业大学 Personalized MI-EEG training and collecting method based on mirror image virtualization and Skinner reinforced learning
JP2018033692A (en) * 2016-08-31 2018-03-08 日本光電工業株式会社 Rehabilitation pegs
WO2018043181A1 (en) * 2016-08-31 2018-03-08 Nihon Kohden Corporation Peg for rehabilitation
US10493323B2 (en) 2017-02-23 2019-12-03 Elwha Llc Personal therapy and exercise monitoring and oversight devices, systems, and related methods
US10828533B2 (en) 2017-02-23 2020-11-10 Elwha Llc Personal therapy and exercise monitoring and oversight devices, systems, and related methods
EP3986266A4 (en) * 2019-06-21 2023-10-04 Rehabilitation Institute of Chicago D/b/a Shirley Ryan Abilitylab WEARABLE JOINT TRACKING DEVICE RELATED TO MUSCLE ACTIVITY AND ASSOCIATED METHODS
US11803241B2 (en) 2019-06-21 2023-10-31 Rehabilitation Institute Of Chicago Wearable joint tracking device with muscle activity and methods thereof
WO2022139772A1 (en) 2020-12-25 2022-06-30 Istanbul Medipol Universitesi Teknoloji Transfer Ofisi Anonim Sirketi Wearable robotic device
EP4266988A4 (en) * 2020-12-25 2024-06-19 Istanbul Medipol Universitesi Wearable robotic device
WO2022169949A1 (en) * 2021-02-03 2022-08-11 Ohio State Innovation Foundation Systems for collaborative interaction using wearable technology
US12411540B2 (en) 2021-02-03 2025-09-09 Ohio State Innovation Foundation Systems for collaborative interaction using wearable technology

Also Published As

Publication number Publication date
EP2996551A1 (en) 2016-03-23
EP2996551A4 (en) 2017-01-25

Similar Documents

Publication Publication Date Title
US10299738B2 (en) Game-based sensorimotor rehabilitator
WO2014186537A1 (en) Game-based sensorimotor rehabilitator
Tran et al. Hand exoskeleton systems, clinical rehabilitation practices, and future prospects
Song et al. Activities of daily living-based rehabilitation system for arm and hand motor function retraining after stroke
Laut et al. The present and future of robotic technology in rehabilitation
Brewer et al. Poststroke upper extremity rehabilitation: a review of robotic systems and clinical results
Merians et al. Sensorimotor training in a virtual reality environment: does it improve functional recovery poststroke?
Huang et al. Recent developments in biofeedback for neuromotor rehabilitation
Amirabdollahian et al. Design, development and deployment of a hand/wrist exoskeleton for home-based rehabilitation after stroke-SCRIPT project
Sucar et al. Gesture therapy: An upper limb virtual reality-based motor rehabilitation platform
CA3170484A1 (en) System and method for determining user intention from limb or body motion or trajectory to control neuromuscular stimulation or prosthetic device operation
CN111449903A (en) Equipment for restoring upper and lower extremity movement
Palaniappan et al. Developing rehabilitation practices using virtual reality exergaming
US20150017623A1 (en) Method and apparatus for neuromotor rehabilitation using interactive setting systems
KR20170124978A (en) Device of VR Cognitive Rehabilitation Contents for Mild cognitive Impairment Patients
CN107638629A (en) A kind of double auxiliary hand function rehabilitation training systems of double feedbacks
Casadio et al. Body machine interface: remapping motor skills after spinal cord injury
Kattoju et al. Automatic improper loading posture detection and correction utilizing electrical muscle stimulation
Basteris et al. Lag–lead based assessment and adaptation of exercise speed for stroke survivors
Liu et al. Interactive torque controller with electromyography intention prediction implemented on exoskeleton robot NTUH-II
Giuffrida et al. Upper-extremity stroke therapy task discrimination using motion sensors and electromyography
Shenbagam et al. A sonomyography-based muscle computer interface for individuals with spinal cord injury
Munih et al. Rehabilitation robotics
Hoda et al. Predicting muscle forces measurements from kinematics data using kinect in stroke rehabilitation
US8016768B2 (en) Hand sensory assessment device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14798022

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2014798022

Country of ref document: EP