
WO2020107113A1 - Apparel for ergonomic evaluation - Google Patents

Apparel for ergonomic evaluation

Info

Publication number
WO2020107113A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensors
body structure
computing device
mannequin
suit
Application number
PCT/CA2019/051696
Other languages
French (fr)
Inventor
Xavier Edgardo CISNEROS GARCIA
Original Assignee
Magna International Inc.
Application filed by Magna International Inc. filed Critical Magna International Inc.
Publication of WO2020107113A1 publication Critical patent/WO2020107113A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body

Definitions

  • the sensors 14 may also be able to measure extension, tension, compression, stress, and strain at a particular location in addition to relative movement and position.
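Tension, compression, and strain measurements of this kind reduce to simple length and force ratios. The following sketch shows how such quantities might be derived from raw sensor readings; the example values, units, and sensor placement are illustrative assumptions, not taken from the disclosure.

```python
def engineering_strain(rest_length_mm: float, measured_length_mm: float) -> float:
    """Strain is the fractional change in length: (L - L0) / L0.
    Positive values indicate tension (extension), negative compression."""
    return (measured_length_mm - rest_length_mm) / rest_length_mm

def stress_mpa(force_n: float, cross_section_mm2: float) -> float:
    """Engineering stress: force per unit area (N/mm^2 == MPa)."""
    return force_n / cross_section_mm2

# Hypothetical reading from a stretch sensor spanning the lower back:
strain = engineering_strain(rest_length_mm=100.0, measured_length_mm=104.0)
print(strain)  # 0.04 -> 4% extension
```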
  • the suit 12 or mannequin 112 may include a plurality of LED indicators 18 at some or all of the joints or locations where the sensors 14 are located.
  • the LED indicators are shown in limited locations in Figures 2 and 3, but it will be appreciated that the indicators 18 may be placed at each sensor location. Similarly, the sensors 14 may be placed at additional locations not illustrated in the figures.
  • the indicators 18 may be configured to produce different colors depending on the state of the sensor 14 and to indicate whether the position or movement of the body at the location of the sensor 14 is a high risk. For example, green can be used for measurements deemed in the range of acceptable, yellow can be used for moderate risk, red can be used for high risk, and orange can be used for very high risk.
  • the determination of risk level may be predetermined and tailored for a particular joint or body location, with a detected value determining which of the colors to produce based on reaching various threshold levels.
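A per-joint threshold lookup of this kind can be sketched as follows. The joints, angle thresholds, and cutoff values below are hypothetical placeholders, since the disclosure leaves the actual levels to the chosen standard.

```python
# Per-joint risk thresholds in degrees of flexion (illustrative values only).
THRESHOLDS = {
    # joint: (acceptable_max, moderate_max, high_max); beyond high_max is very high
    "neck": (20.0, 45.0, 60.0),
    "back": (20.0, 45.0, 60.0),
    "knee": (60.0, 90.0, 120.0),
}

def risk_color(joint: str, flexion_deg: float) -> str:
    """Map a measured flexion angle to the LED color for that joint:
    green = acceptable, yellow = moderate, red = high, orange = very high."""
    acceptable, moderate, high = THRESHOLDS[joint]
    if flexion_deg <= acceptable:
        return "green"
    if flexion_deg <= moderate:
        return "yellow"
    if flexion_deg <= high:
        return "red"
    return "orange"

print(risk_color("back", 50.0))  # red
```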
  • the risk level may be based on fixed thresholds or may be the result of inputting data from the sensors into a formula such as a formula developed by the National Institute for Occupational Safety and Health (NIOSH).
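For reference, the revised NIOSH lifting equation computes a Recommended Weight Limit (RWL) from task geometry. A minimal sketch follows, with the frequency (FM) and coupling (CM) multipliers supplied from the published tables rather than computed, and without the boundary handling of the full standard.

```python
def niosh_rwl(h_cm: float, v_cm: float, d_cm: float, a_deg: float,
              fm: float = 1.0, cm: float = 1.0) -> float:
    """Recommended Weight Limit (kg) from the revised NIOSH lifting equation:
    RWL = LC * HM * VM * DM * AM * FM * CM, with load constant LC = 23 kg.
    FM (frequency) and CM (coupling) come from published tables and are
    passed in directly here."""
    lc = 23.0                            # load constant, kg
    hm = min(1.0, 25.0 / h_cm)           # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)  # vertical multiplier
    dm = 0.82 + 4.5 / d_cm               # distance multiplier
    am = 1.0 - 0.0032 * a_deg            # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

def lifting_index(load_kg: float, rwl_kg: float) -> float:
    """LI > 1.0 indicates elevated risk of lifting-related injury."""
    return load_kg / rwl_kg

print(round(niosh_rwl(25.0, 75.0, 45.0, 0.0), 2))  # 21.16
```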
  • the suit 12 or mannequin 112 may include a computing device 40 ( Figures 2 and 3) in communication with the sensors 14.
  • the computing device 40 may be in the form of a tablet computer with touchscreen inputs that may be used to control the activity and improve ease of use.
  • other computing device types may be included depending on the needs of the user.
  • the computing device 40 may be in communication, such as wireless or wired communication, with a remote computing device 50 ( Figure 3) having a database 52 and software 54 associated with the database 52 configured to evaluate the measurements received from the sensors 14 via the computing device 40.
  • the sensors 14 may communicate directly with the remote computing device 50.
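One way to sketch such communication is a small serializable record per sensor sample. The field names and JSON wire format below are assumptions for illustration, as the disclosure does not specify a protocol.

```python
import json
import time

def encode_reading(sensor_id: str, joint: str, xyz, rotation_deg: float) -> str:
    """Package one sensor sample for transmission to the remote
    computing device as a JSON string."""
    return json.dumps({
        "sensor": sensor_id,
        "joint": joint,
        "timestamp": time.time(),
        "position_mm": list(xyz),
        "rotation_deg": rotation_deg,
    })

msg = encode_reading("S14-03", "elbow", (412.0, 130.5, 988.2), 47.0)
decoded = json.loads(msg)  # the remote device reverses the encoding
```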
  • the sensors 14 may be configured to detect different types of movements and/or locations. Different sensors 14 at different locations may measure different aspects. The following is a non-exhaustive list of different measurements that can be taken at different sensor 14 locations.
  • the neck sensor 14 may measure twist, flexion, and/or extension.
  • the hip sensor 14 may measure twist and/or rotation.
  • the finger sensor 14 may measure flexion and/or extension.
  • the back sensor 14 may measure flexion and/or extension.
  • the wrist sensor 14 may measure twist, flexion, and extension.
  • the knee sensor 14 may measure twist, flexion, and extension.
  • the foot sensor 14 may measure twist, flexion, and/or extension.
  • the leg sensor 14 may measure twist, flexion, and/or extension.
  • the elbow sensor 14 may measure twist, flexion, or extension.
  • the shoulder sensor 14 may measure twist and/or rotation.
  • the above sensors 14 may also measure other aspects of movement, and there may be additional sensors not listed placed on the suit 12 or mannequin 112 to measure other movement aspects if desired.
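The per-joint capabilities listed above can be captured in a simple lookup table; a sketch follows, with the joint names and aspects transcribed from the list.

```python
# Measurement capabilities per sensor location, mirroring the per-joint list.
JOINT_MEASUREMENTS = {
    "neck":     {"twist", "flexion", "extension"},
    "hip":      {"twist", "rotation"},
    "finger":   {"flexion", "extension"},
    "back":     {"flexion", "extension"},
    "wrist":    {"twist", "flexion", "extension"},
    "knee":     {"twist", "flexion", "extension"},
    "foot":     {"twist", "flexion", "extension"},
    "leg":      {"twist", "flexion", "extension"},
    "elbow":    {"twist", "flexion", "extension"},
    "shoulder": {"twist", "rotation"},
}

def measures(joint: str, aspect: str) -> bool:
    """True if the sensor at `joint` reports the given movement aspect."""
    return aspect in JOINT_MEASUREMENTS.get(joint, set())

print(measures("hip", "rotation"))  # True
print(measures("hip", "flexion"))   # False
```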
  • the mannequin 112 may thereby be adjusted at the target work station and may simulate the tasks that are performed at the work station, thereby subjecting the mannequin 112 to the various positions required to carry out the activity at the work station.
  • the data produced by the sensors 14 may be collected automatically by one or both of the computing device 40 and the remote computing device 50, and stored in the database 52 and processed by the software 54 to make an automatic determination and risk assessment.
  • the software 54 may use the data from the database 52 and apply the data to a predetermined ergonomic risk assessment program (ERP) or a NIOSH equation.
  • the evaluation therefore produces objective results from the collected data and can be performed without the subjective perspective of an individual evaluator.
  • the software 54 may also provide an output that is then stored in the database 52 along with the measurements from the sensors 14.
  • the software may determine a risk level (such as one of the color-coded risk levels described above), and the risk level may be stored in the database 52 along with the measurements to identify particular measurements with the resulting risk level.
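Storing each measurement alongside its assigned risk level might look like the following SQLite sketch; the schema, task names, and example rows are hypothetical.

```python
import sqlite3

# In-memory stand-in for database 52: each row pairs a joint measurement
# with the risk level the software assigned to it.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measurements (
        task TEXT, joint TEXT, flexion_deg REAL, risk TEXT
    )
""")

def store(task, joint, flexion_deg, risk):
    conn.execute(
        "INSERT INTO measurements VALUES (?, ?, ?, ?)",
        (task, joint, flexion_deg, risk),
    )

store("lift bin", "back", 52.0, "red")
store("lift bin", "knee", 20.0, "green")

# Retrieve every high-risk measurement for review:
rows = conn.execute(
    "SELECT joint, flexion_deg FROM measurements WHERE risk = 'red'"
).fetchall()
print(rows)  # [('back', 52.0)]
```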
  • a worker may wear the suit 12, as shown in Figure 5, and the worker’s posture may be traced for each activity that is performed.
  • the data from the sensors 14 acquired during the posture trace may then be automatically collected and stored in the database 52, and the software 54 may evaluate the data relative to a pre-determined standard to determine or indicate, for each joint, the elevation that falls within the lowest risk range.
  • Figure 6 illustrates a database that may include various measurements and data for various tasks, and may include color-coded risk assessments based on the objective determinations of the software 54.
  • Figures 7A-7B further illustrate other databases 52 and software 54 that may evaluate the measurements from the sensors 14 and evaluate the measurements objectively according to a predetermined standard or formula.


Abstract

A system for automatically performing an ergonomic evaluation includes a body structure in the form of a suit for wearing by a worker, or a mannequin that may be manipulated to perform the same tasks as a worker. The body structure may include sensors disposed at various locations, such as body joints. The body structure may be placed in multiple positions corresponding to movements performed at a work station. Data from the sensors may be stored in a database, and a computing device may automatically evaluate the measurements stored in the database and perform an objective evaluation and risk assessment based on the stored measurement data.

Description

APPAREL FOR ERGONOMIC EVALUATION
TECHNICAL FIELD
[0001] The present disclosure relates to an ergonomic evaluation system. More particularly, the present disclosure relates to apparel for use in ergonomic evaluation systems.
BACKGROUND OF THE DISCLOSURE
[0002] Industrial manufacturing processes, such as those used in the automotive industry, have long utilized the labors of individuals for performing various tasks. For example, men and women may manually operate different machinery or equipment on an assembly line or on the manufacturing floor. They may also be tasked with lifting, moving, and installing various components, or operating mechanisms that perform these functions. These various tasks may be performed in a variety of body positions, many of which can lead to stresses on the body, and may often lead to repetitive stress injuries.
[0003] Manufacturers and employers have made improvements and developments to the workplace for these workers, by developing processes that reduce the stress on the human body through research and evaluation of the working environment. In instances where a stressful body position is evaluated and determined, the process may be adjusted to reduce the probability of these injuries by reducing the stress that may occur.
[0004] For example, if a worker must extend their arms repeatedly to reach a component from stock, this repetitive reaching may lead to future arm injuries, in particular at one of the arm joints, such as the shoulder, elbow, and/or wrist. By evaluating the process and determining that the reaching distance may be problematic, the process may be adjusted to reduce the reaching distance, or the process may be adjusted to provide a reaching tool that may be operated without requiring the worker to perform the stressful operation.
[0005] While it is beneficial to re-design a stressful working environment for various workers, it can be difficult and inefficient to determine which setups and environments will lead to an increased potential for injury. One method of determining an undesirable setup requires ergonomic study and evaluation by a person operating as an ergonomic evaluator. The evaluator will study a setup, observe the working conditions, and make a determination on the injury potential. However, these evaluations are subjective, depending on the training of the evaluator and the evaluator’s opinion. The evaluator may take photos or video of the operation, and make determinations based on this record. The determination is also dependent on the evaluator’s schedule and the amount of time it takes the evaluator to come to a conclusion.
[0006] In many cases, the process being evaluated is already being performed by the workers, and it is these workers that are observed and evaluated. Accordingly, the workers may be subjected to undesirable working conditions over an extended period of time to allow the evaluator to make all of the necessary evaluations, and the workers may continue to be subjected to undesirable working conditions during the time it takes for the evaluator to consider his or her observations and come to a conclusion.
[0007] In view of the foregoing, there remains a need for improvements to ergonomic evaluation systems.
SUMMARY OF THE INVENTION
[0008] A system for performing ergonomic evaluation is provided. The system includes a body structure including moveable portions thereof, wherein the body structure is in the form of a human body. The system further includes a plurality of sensors disposed on the body structure at locations corresponding to pre-determined human body locations. A computing device is in communication with the plurality of sensors, wherein the computing device receives location data from the plurality of sensors and automatically determines a risk level in response to the location data.
[0009] In one aspect, the body structure is a bodysuit.
[0010] In one aspect, the body structure is a mannequin.
[0011] In one aspect, the body structure includes a plurality of joints corresponding to joints of the human body and a plurality of articulating limb portions.
[0012] In one aspect, the plurality of sensors are disposed at the plurality of joints.
[0013] In one aspect, the plurality of sensors include position sensors.
[0014] In one aspect, the plurality of sensors include rotatory or gyroscope sensors.
[0015] In one aspect, the system includes LEDs disposed on the body structure, wherein the LEDs illuminate in response to the risk level.
[0016] In one aspect, the computing device is a remote computing device, and the body structure includes a second computing device attached thereto, the second computing device in communication with the remote computing device.
[0017] In one aspect, the sensors measure at least one of twist, flexion, or tension.
[0018] According to another aspect of the disclosure, a method for automatically performing an ergonomic evaluation is provided. The method includes providing a body structure in a first position, wherein the body structure includes moveable portions thereof, and where the body structure is in the form of a human body. The method further includes detecting first location data of a plurality of the moveable portions in the first position from sensors disposed on the body structure.
[0019] The method further includes articulating the body structure into a second position. The method also includes detecting second location data from the plurality of the moveable portions in the second position from the sensors. The method further includes storing the first and second location data in a database of a computing device, and automatically performing a risk assessment by the computing device based on the first and second location data stored in the database.
[0020] In one aspect, the body structure is in the form of a bodysuit, and the method includes recording movement of a human wearing the bodysuit.
[0021] In one aspect, the body structure is in the form of a mannequin, and the method includes recording movement of the mannequin.
[0022] In one aspect, the method includes measuring at least one of tension, compression, stress, or strain via the sensors.
[0023] In one aspect, the method includes illuminating a plurality of LEDs disposed on the body structure in response to determining that a risk assessment corresponds to a predetermined level.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] Other advantages of the present invention will be readily appreciated, as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
[0025] Figure 1 is a perspective view of an ergonomic evaluation system with a body suit including sensors disposed thereon;
[0026] Figure 2 is a front perspective view of a mannequin having sensors disposed thereon;
[0027] Figure 3 is a rear perspective view of the mannequin having the sensors, and further illustrates a computing device on the mannequin in communication with a remote computing device having a database and software for automatically performing an ergonomic evaluation;
[0028] Figure 4 illustrates a prior art subjective ergonomic evaluation;
[0029] Figure 5 illustrates a worker wearing the suit;
[0030] Figure 6 illustrates one form of the database and software of the system; and
[0031] Figures 7A-7B illustrate another form of the database and software of the system.
DESCRIPTION OF THE ENABLING EMBODIMENT
[0032] Referring to Figure 1, a system 10 for evaluating ergonomic conditions at work stations is provided. The system 10 includes a suit 12 having a plurality of sensors 14 disposed at various locations on the suit 12, and a computing device 16 configured to receive signals from the sensors 14 and further configured to analyze data from the signals in an objective manner to make an objective ergonomic evaluation.
[0033] The suit 12 may be in the form of a bodysuit that may be wearable by a worker or mannequin to simulate the movements to be made at a given workstation. The suit 12 may include various portions to allow the suit 12 to be placed on a user’s body in a manner sufficient to reproduce the various movements that may be made by the body and which are desired to be measured for ergonomic evaluation. Portions of the suit 12 may include, but are not limited to, a torso portion, leg portions, foot portions, arm portions, a head portion, an eye portion, and hand portions. It will be appreciated that other connecting portions may also be included that conform to other portions of the human body, and that the above list is merely illustrative, and is not intended to exclude other portions of the body that the suit 12 may cover or reproduce.
[0034] The suit 12 may be made in different sizes to fit various body sizes and types. The suit 12 may be adjustable in some areas. For example, the suit 12 may be adjustable in areas that are located between the sensors 14 that are disposed at different locations of the suit 12. Accordingly, the suit 12 may be adjusted in these between-sensor locations in order to locate the sensors 14 at the desired location on the body.
[0035] In one aspect, the suit 12 may be a single component. In this aspect, the suit 12 may include zippers, buttons, or the like to secure the suit 12 to the user’s body. In another aspect, the suit 12 may be in the form of multiple pieces that are attached to different portions of the body separately. For example, the suit 12 may include separate portions for the legs and may be in the form of pants, or the suit may have separate sleeves for use with individual legs of the user’s body. Similarly, the suit 12 may include a shirt portion covering the torso and arms, or the suit may have a separate torso portion and separate arm portions. The suit 12 may include a separate component to be worn on the head, such as a hat or mask. The suit 12 may include separate portions for the feet, such as shoes.
[0036] The suit 12 may be made of any flexible clothing type material that will move in accordance with body movements in order to reproduce and simulate the position of the body and various joints and other portions of the body when the body is performing a manual task. For example, the suit 12 may be made of an elastane material that is expandable around portions of the body in response to being placed over the body. The suit 12 may also be made of other material types. The suit 12 may include different material types at different parts of the body. For example, a flexible material may be used on the torso, arms, and legs, with a more rigid or non-conforming material used atop the user’s head, such as a typical hat material.
[0037] The suit 12 may be configured to measure the locations of various portions of the human body. For example, the suit 12 may be configured to measure locations of the neck, shoulder, back, hip, knee, elbow, wrist, fingers, and foot. It will be appreciated that other locations on the body may also be measured, and in particular locations with joints that will bend and flex to produce the movement of the human body that occurs in a work station environment. For example, a horizontal and vertical location may be measured at different times for an individual one of the sensors, as well as a travel distance. This measurement may be performed for each of the other sensors 14, creating a kinematic record of movement of the body relative to previous postures.
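The travel-distance part of such a kinematic record can be computed by summing the straight-line distances between successive position samples; a minimal sketch follows, with made-up sample coordinates for a single sensor.

```python
import math

def travel_distance(samples):
    """Total path length traced by one sensor across a sequence of
    (horizontal, vertical) position samples, in the samples' units."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total

# Horizontal/vertical positions of a hypothetical wrist sensor over four samples:
path = [(0.0, 0.0), (3.0, 4.0), (3.0, 4.0), (6.0, 8.0)]
print(travel_distance(path))  # 10.0
```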
[0038] To allow the suit 12 to produce measurements and data of the body, the sensors 14 may be placed on the suit 12 at a variety of locations, and these sensors 14 may produce 2D and 3D location data. The sensors 14 may produce this location data at various times throughout an evaluation period, and the positions detected by the sensors 14 may be combined to simulate the movement of the body over a period of time.
[0039] The sensors 14 may be located, in one approach, at various joints of the body and therefore at various locations on the suit 12 corresponding to the body joints where location measurements are desired. For example, the sensors 14 may be placed at the ankle, knee, hip, back, neck, shoulder, elbow, wrist, and fingers. Sensors 14 may also be placed at the toes, the top of the head, at the eyes, or at the ears. The sensors 14 may be in the form of rotatory or gyroscope sensors, and can therefore produce data associated with rotation as well as position.
[0040] Multiple sensors 14 may be evaluated relative to each other to produce a measurement or reproduction of a body position. For example, the shoulder, elbow, and wrist may be evaluated relative to each other, and the respective positions between these particular sensors 14 can indicate whether the arm is straight or bent, or the degree to which the arm is bent. For example, those three points may be detected, and the extent to which they are aligned, or the extent to which they define a triangle, may determine the degree to which the arm is bent. Similarly, the position of the wrist or elbow sensor 14 relative to the shoulder sensor 14 may be used to determine whether the arm is raised. Similar relative positions may be detected to determine similar bending movements or positions of the legs. It will be appreciated that the data from all of the sensors 14 may be used to determine the general location of each of the predefined joints, and based on this information the orientation and movement of the body can be determined.
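One way the shoulder–elbow–wrist alignment described above could be quantified is as the interior angle at the elbow: roughly 180° for a straight arm, smaller as the arm bends. This is a hedged sketch, not the patent's implementation; the function name and the sample coordinates are hypothetical.

```python
import math

def joint_angle(a, b, c):
    """Interior angle at point b (degrees) formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist: near 180 deg means a straight arm."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

# Hypothetical sensor positions (cm): elbow bent at a right angle.
shoulder, elbow, wrist = (0, 0, 0), (30, 0, 0), (30, 25, 0)
print(round(joint_angle(shoulder, elbow, wrist)))  # → 90
```

The same three-point calculation applies to the hip–knee–ankle sensors for leg bending.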
[0041] In one approach, the suit 12 may be in the form of a mannequin 112, as shown in Figures 2 and 3, either by placing the flexible suit 12 on a mannequin body, or by placing the sensors 14 directly on the mannequin 112. In the case of the mannequin 112, the mannequin 112 may preferably be constructed and constrained to move in the same manner as a traditional human body. For example, the mannequin 112 may be restricted such that knees and elbows generally bend in a single direction, and such that the neck and head cannot spin fully around.
[0042] In the case of the mannequin 112, the sensors 14 may be placed at the knee, hip, elbow, and neck, or other joints in which bending or twisting or rotating may occur, and the sensors 14 may be configured to detect rotation or twisting independently. Put another way, a single one of the sensors 14 may determine whether a limb is straight or bent without input from an adjacent sensor. However, the mannequin 112 may also include sensors 14 that simply measure 2D or 3D position, and may be used in combination with the other sensors 14 to determine relative movement and bending/rotation.
[0043] The sensors 14 may also be able to measure extension, tension, compression, stress, and strain at a particular location in addition to relative movement and position.
[0044] In addition to the sensors 14, the suit 12 or mannequin 112 may include a plurality of LED indicators 18 at some or all of the joints or locations where the sensors 14 are located. The LED indicators are shown in limited locations in Figures 2 and 3, but it will be appreciated that the indicators 18 may be placed at each sensor location. Similarly, the sensors 14 may be placed at additional locations not illustrated in the figures.
[0045] The indicators 18 may be configured to produce different colors depending on the state of the sensor 14 and to indicate whether the position or movement of the body at the location of the sensor 14 is a high risk. For example, green can be used for measurements deemed in the range of acceptable, yellow can be used for moderate risk, red can be used for high risk, and orange can be used for very high risk. The determination of risk level may be predetermined and tailored for a particular joint or body location, with a detected value determining which of the colors to produce based on reaching various threshold levels. The risk level may be based on fixed thresholds or may be the result of inputting data from the sensors into a formula such as a formula developed by the National Institute for Occupational Safety and Health (NIOSH).
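The threshold-based green/yellow/red/orange mapping described in [0045] could be sketched as follows. The joint name and threshold values here are invented for illustration; real limits would be tailored per joint from an ergonomic standard or NIOSH-derived table.

```python
# Hypothetical per-joint thresholds (upper bound of each band).
RISK_BANDS = {
    "neck_flexion_deg": [(20, "green"), (45, "yellow"), (60, "red")],
}

def risk_color(joint, value):
    """Map a measured value to an indicator color: green (acceptable),
    yellow (moderate), red (high), orange (very high risk)."""
    for threshold, color in RISK_BANDS[joint]:
        if value <= threshold:
            return color
    return "orange"  # beyond all predefined thresholds: very high risk

print(risk_color("neck_flexion_deg", 15))  # → green
print(risk_color("neck_flexion_deg", 70))  # → orange
```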
[0046] The suit 12 or mannequin 112 may include a computing device 40 (Figures 2 and 3) in communication with the sensors 14. In one approach, the computing device 40 may be in the form of a tablet computer with touchscreen inputs that may be used to control the activity and improve ease of use. However, other computing device types may be included depending on the needs of the user.
[0047] The computing device 40 may be in communication, such as wireless or wired communication, with a remote computing device 50 (Figure 3) having a database 52 and software 54 associated with the database 52 configured to evaluate the measurements received from the sensors 14 via the computing device 40. In another approach, the sensors 14 may communicate directly with the remote computing device 50.
[0048] As described above, the sensors 14 may be configured to detect different types of movements and/or locations. Different sensors 14 at different locations may measure different aspects. The following is a non-exhaustive list of different measurements that can be taken at different sensor 14 locations. The neck sensor 14 may measure twist, flexion, and/or extension. The hip sensor 14 may measure twist and/or rotation. The finger sensor 14 may measure flexion and/or extension. The back sensor 14 may measure flexion and/or extension. The wrist sensor 14 may measure twist, flexion, and extension. The knee sensor 14 may measure flexion and/or extension. The foot sensor 14 may measure twist, flexion, and/or extension. The leg sensor 14 may measure twist, flexion, and/or extension. The elbow sensor 14 may measure twist, flexion, or extension. The shoulder sensor 14 may measure twist and/or rotation. The above sensors 14 may also measure other aspects of movement, and there may be additional sensors not listed placed on the suit 12 or mannequin 112 to measure other movement aspects if desired.
[0049] The mannequin 112 may thereby be adjusted at the target work station and may simulate the tasks that are performed at the work station, thereby subjecting the mannequin 112 to the various positions required to carry out the activity at the work station. The data produced by the sensors 14 may be collected automatically by one or both of the computing device 40 and the remote computing device 50, and stored in the database 52 and processed by the software 54 to make an automatic determination and risk assessment.
[0050] The software 54 may use the data from the database 52 and apply the data to a predetermined ergonomic risk evaluation (ERE) program or a NIOSH equation. The collected data therefore can produce objective results, and the evaluation can be performed without the subjective perspective of an individual evaluator. The software 54 may also provide an output that is then stored in the database 52 along with the measurements from the sensors 14. For example, the software may determine a risk level (such as one of the color-coded risk levels described above), and the risk level may be stored in the database 52 along with the measurements to associate particular measurements with the resulting risk level.
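For reference, a sketch of the revised NIOSH lifting equation that such software might apply (metric form). The frequency and coupling multipliers are normally looked up in NIOSH tables and are simplified to 1.0 here; the example posture values are hypothetical.

```python
def niosh_rwl(h, v, d, a, fm=1.0, cm=1.0):
    """Recommended Weight Limit (kg) per the revised NIOSH lifting
    equation (metric form). h: horizontal hand distance (cm), v: vertical
    origin height (cm), d: vertical travel distance (cm), a: asymmetry
    angle (deg). fm/cm: frequency and coupling multipliers, normally
    taken from NIOSH tables (1.0 here for a single well-coupled lift)."""
    lc = 23.0                      # load constant, kg
    hm = 25.0 / h                  # horizontal multiplier
    vm = 1 - 0.003 * abs(v - 75)   # vertical multiplier
    dm = 0.82 + 4.5 / d            # distance multiplier
    am = 1 - 0.0032 * a            # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

# Hypothetical lift: ideal posture except for 45 cm of vertical travel.
rwl = niosh_rwl(h=25, v=75, d=45, a=0)
lifting_index = 10.0 / rwl  # lifting index for a 10 kg load
print(round(rwl, 1), round(lifting_index, 2))  # → 21.2 0.47
```

A lifting index above 1.0 indicates the load exceeds the recommended limit, which could feed directly into the color-coded risk levels described above.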
[0051] Similarly, rather than simulate the mannequin 112 at the work station, a worker may wear the suit 12, as shown in Figure 5, and the worker’s posture may be traced for each activity that is performed. The data from the sensors 14 acquired from the posture trace may then be automatically obtained and stored in the database 52, and the software 54 may evaluate the data relative to a pre-determined standard to determine or indicate the elevation of each joint that is within the lowest risk range.
[0052] The above-described use of the mannequin 112 or the suit 12 thereby provides an automated measurement process that cannot be achieved with the same specificity by an outside observer who subjectively watches a worker perform a task. An example of such a subjective observation is shown in Figure 4, where a worker W performs various tasks, with different body positions photographed or captured, and where a subjective evaluator manually identifies different positions of the body at different risk levels. The process of Figure 4 is therefore overly time consuming and inconsistent.
[0053] Figure 6 illustrates a database that may include various measurements and data for various tasks, and may include color-coded risk assessments based on the objective determinations of the software 54. Figures 7A-7B further illustrate other databases 52 and software 54 that may evaluate the measurements from the sensors 14 objectively according to a predetermined standard or formula.
[0054] Obviously, many modifications and variations of the present invention are possible in light of the above teachings and may be practiced otherwise than as specifically described while within the scope of the appended claims. These antecedent recitations should be interpreted to cover any combination in which the inventive novelty exercises its utility.

Claims

What is claimed is:
Claim 1. A system for performing ergonomic evaluation, the system comprising:
a body structure including moveable portions thereof, wherein the body structure is in the form of a human body;
a plurality of sensors disposed on the body structure at locations corresponding to pre-determined human body locations;
a computing device in communication with the plurality of sensors, wherein the computing device receives location data from the plurality of sensors and automatically determines a risk level in response to the location data.
Claim 2. The system of claim 1, wherein the body structure is a bodysuit.
Claim 3. The system of claim 1, wherein the body structure is a mannequin.
Claim 4. The system of claim 1, wherein the body structure includes a plurality of joints corresponding to joints of the human body and a plurality of articulating limb portions.
Claim 5. The system of claim 4, wherein the plurality of sensors are disposed at the plurality of joints.
Claim 6. The system of claim 1, wherein the plurality of sensors include position sensors.
Claim 7. The system of claim 1, wherein the plurality of sensors include rotatory or gyroscope sensors.
Claim 8. The system of claim 1 further comprising LEDs disposed on the body structure, wherein the LEDs illuminate in response to the risk level.
Claim 9. The system of claim 1, wherein the computing device is a remote computing device, and the body structure includes a second computing device attached thereto, the second computing device in communication with the remote computing device.
Claim 10. The system of claim 1, wherein the sensors measure at least one of twist, flexion, or tension.
Claim 11. A method for automatically performing an ergonomic evaluation, the method comprising:
providing a body structure in a first position, wherein the body structure includes moveable portions thereof, and wherein the body structure is in the form of a human body;
detecting first location data of a plurality of the moveable portions in the first position from sensors disposed on the body structure;
articulating the body structure into a second position;
detecting second location data from the plurality of the moveable portions in the second position from the sensors;
storing the first and second location data in a database of a computing device; and
automatically performing a risk assessment by the computing device based on the first and second location data stored in the database.
Claim 12. The method of claim 11, wherein the body structure is in the form of a bodysuit, and the method includes recording movement of a human wearing the bodysuit.
Claim 13. The method of claim 11, wherein the body structure is in the form of a mannequin, and the method includes recording movement of the mannequin.
Claim 14. The method of claim 11 further comprising measuring at least one of tension, compression, stress, or strain via the sensors.
Claim 15. The method of claim 11 further comprising illuminating a plurality of LEDs disposed on the body structure in response to determining that a risk assessment corresponds to a predetermined level.
PCT/CA2019/051696 2018-11-28 2019-11-27 Apparel for ergonomic evaluation WO2020107113A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862772396P 2018-11-28 2018-11-28
US62/772,396 2018-11-28

Publications (1)

Publication Number Publication Date
WO2020107113A1 true WO2020107113A1 (en) 2020-06-04

Family

ID=70853345

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2019/051696 WO2020107113A1 (en) 2018-11-28 2019-11-27 Apparel for ergonomic evaluation

Country Status (1)

Country Link
WO (1) WO2020107113A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114788695A (en) * 2022-05-23 2022-07-26 长春电子科技学院 Human body posture detection device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2593027A1 (en) * 2004-12-14 2006-06-22 Smartex S.R.L. Devices and method for monitoring the form of three-dimensional objects
US7980141B2 (en) * 2007-07-27 2011-07-19 Robert Connor Wearable position or motion sensing systems or methods
CN104269093A (en) * 2014-10-11 2015-01-07 中国人民解放军总后勤部军需装备研究所 Testing system for dummy walking with load
US8942662B2 (en) * 2012-02-16 2015-01-27 The United States of America, as represented by the Secretary, Department of Health and Human Services, Center for Disease Control and Prevention System and method to predict and avoid musculoskeletal injuries
CA2842441C (en) * 2013-04-05 2016-05-03 The Boeing Company Creating ergonomic manikin postures and controlling computer-aided design environments using natural user interfaces
US9759539B2 (en) * 2011-05-25 2017-09-12 Korea Institute Of Science And Technology Method of motion tracking
CA3021087A1 (en) * 2016-04-13 2017-10-19 Strong Arm Technologies, Inc. Systems and devices for motion tracking, assessment, and monitoring and methods of use thereof
US20180000416A1 (en) * 2016-07-01 2018-01-04 Pawankumar Hegde Garment-based ergonomic assessment



Similar Documents

Publication Publication Date Title
MassirisFernández et al. Ergonomic risk assessment based on computer vision and machine learning
US10869632B2 (en) System and method for ergonomic analysis, in particular of a worker
Humadi et al. In-field instrumented ergonomic risk assessment: Inertial measurement units versus Kinect V2
JP7346791B2 (en) Regarding ergonomic analysis of hands, in particular gloves equipped with sensors for ergonomic analysis of workers' hands and corresponding methods
Valero et al. Musculoskeletal disorders in construction: A review and a novel system for activity tracking with body area network
Lee et al. Automated evaluation of upper-limb motor function impairment using Fugl-Meyer assessment
Lorenzini et al. An online multi-index approach to human ergonomics assessment in the workplace
Steinebach et al. Accuracy evaluation of two markerless motion capture systems for measurement of upper extremities: Kinect V2 and Captiv
JP2021128794A (en) Stereoscopic image generation system, stereoscopic image generation method, and stereoscopic image generation program
Ellegast et al. Workload assessment in field using the ambulatory CUELA system
JP6629556B2 (en) Process composition support device and process composition support method
WO2020107113A1 (en) Apparel for ergonomic evaluation
JP6273141B2 (en) Multi-faceted evaluation system for working posture
Klepser et al. Functional measurements and mobility restriction (from 3D to 4D scanning)
US9949685B2 (en) Instrumented sleeve
Hida et al. Work postural ergonomic assessment using two-dimensional joint coordinates
Tognetti et al. Wearable kinesthetic systems for capturing and classifying body posture and gesture
Adams et al. Methods for assessing protective clothing effects on worker mobility
Ng et al. Dynamic ease allowance in arm raising of functional garment
Lee et al. Reliability and validity of a posture matching method using inertial measurement unit-based motion tracking system for construction jobs
KR20160118678A (en) Workload estimation and job rotation scheduling for Work-related Musculoskeletal Disorders prevention
Kim et al. Thumb joint angle estimation for soft wearable hand robotic devices
Lee et al. Motion tracking smart work suit with a modular joint angle sensor using screw routing
Griffin et al. 3D hand scanning to digital draping for glove design
JP6045338B2 (en) Posture sensation system and posture evaluation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19888302

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19888302

Country of ref document: EP

Kind code of ref document: A1