Smart Sensors: Applications and Advances in Human Motion Analysis

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (31 October 2021) | Viewed by 53474

Special Issue Editors


Dr. Cristina P. Santos
Guest Editor

Dr. Joana Figueiredo
Guest Editor
Postdoctoral researcher, Center for MicroElectroMechanical Systems (CMEMS), University of Minho, Portugal
Interests: gait rehabilitation robotics; wearable motion sensors; gait analysis; human motion recognition

Special Issue Information

Dear Colleagues,

New directions in human motion analysis cover motion recognition and prediction, as well as the perception of human-robot interaction, using sensor-based technologies driven by recent technological advances (wearable sensors, advanced sensors, artificial intelligence and machine learning, electronic and smart sensing textiles, and so on). Advanced applications in both areas rely on combining smart sensors with algorithmic advances in artificial intelligence.

The adequate analysis of human motion requires the consideration of several aspects: unobtrusive, low-cost wearable sensors capable of tracking relevant motion in free-living conditions; machine learning-based strategies for reasoning over sensor data; and intuitive, collaborative sensor-based technologies that assist and interact with users in a timely manner across various environments. Accurate decision making based on human motion may bring new achievements in diverse robotic applications. The inclusion of motion prediction strategies in robotic assistive devices is necessary to provide patients with personalized motor assistance and to prevent risk situations. Moreover, collaborative robots used in Industry 4.0 programs, as well as social robots, will benefit from being continuously informed of the user's motor performance and safety.

This Special Issue covers new strategies to recognize and predict human motion and human-robot interaction, in both clinical and industrial settings, through the application of smart sensors or the innovative use of standard wearable sensors. Contributions on sensor-based biofeedback strategies that augment human collaboration with robotic systems are also encouraged.

Contributions may include, but are not limited to:

  • Smart sensors for human motion analysis;
  • Sensors for decision making and smart-based applications;
  • Wearable sensor-based strategies for motion intention recognition;
  • Machine learning algorithms for human motion recognition and prediction;
  • Machine learning-based sensor measurements for human motion estimation;
  • Sensor applications in collaborative and assistive robots;
  • Advanced strategies for improving human-robot interaction;
  • Sensing for physical human-robot interaction;
  • Applications of sensors for robotics.

You may choose our Joint Special Issue in Machines.

Dr. Cristina P. Santos
Dr. Joana Figueiredo
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (11 papers)


Research

11 pages, 1928 KiB  
Communication
Stack LSTM-Based User Identification Using Smart Shoes with Accelerometer Data
by Do-Yun Kim, Seung-Hyeon Lee and Gu-Min Jeong
Sensors 2021, 21(23), 8129; https://doi.org/10.3390/s21238129 - 5 Dec 2021
Cited by 5 | Viewed by 2522
Abstract
In this study, we propose a long short-term memory (LSTM)-based user identification method using accelerometer data from smart shoes. In general, user identification with human walking data requires a pre-processing stage in order to divide the walking data into individual steps. Next, user identification can be made with the divided step data. In these approaches, when there exist partial data that do not complete a single step, it is difficult to apply those data to the classification. Considering these facts, in this study, we present a stack LSTM-based user identification method for smart-shoe data. Rather than using a complicated analysis method, we designed an LSTM network for user identification with accelerometer data of smart shoes. In order to learn partial data, the LSTM network was trained using walking data with random sizes and random locations. Then, the identification can be made without any additional analysis such as step division. In the experiments, user walking data collected over 10 m were used. The experimental results show that the average recognition rate was about 93.41%, 97.19%, and 98.26% when using walking data of 2.6, 3.9, and 5.2 s, respectively. With these experimental results, we show that the proposed method can classify users effectively. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: graphical abstract; the Footlogger smart insole; phases of a typical gait cycle; exemplary accelerometer data; general and stack LSTM architectures; stack LSTM with random-size, random-location input; data selection with random location and size; the designed network for variable window sizes (20 and 200); five-fold cross-validation split; test data selection for performance evaluation.
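
For readers who want a concrete starting point, the snippet below is a minimal sketch of a stacked-LSTM classifier of the general kind described in this abstract: it consumes accelerometer windows cropped at a random size and location and outputs a user label. The layer sizes, user count, and window bounds are illustrative assumptions, not the authors' configuration.

```python
# Illustrative sketch only: a stacked LSTM that classifies users from
# accelerometer windows of random length and random start position.
# Layer sizes, user count, and sampling scheme are assumptions.
import torch
import torch.nn as nn

class StackedLSTMIdentifier(nn.Module):
    def __init__(self, n_channels=3, hidden=64, n_layers=2, n_users=10):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, num_layers=n_layers,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_users)

    def forward(self, x):                 # x: (batch, time, channels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # classify from the last time step

def random_window(signal, min_len=20, max_len=200):
    """Crop a window of random size at a random location from one recording."""
    length = torch.randint(min_len, max_len + 1, (1,)).item()
    start = torch.randint(0, signal.shape[0] - length + 1, (1,)).item()
    return signal[start:start + length]

if __name__ == "__main__":
    model = StackedLSTMIdentifier()
    recording = torch.randn(1000, 3)          # fake 3-axis accelerometer data
    batch = random_window(recording).unsqueeze(0)
    print(model(batch).shape)                 # (1, n_users)
```
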
28 pages, 5454 KiB  
Article
Wearable Inertial Measurement Unit Sensing System for Musculoskeletal Disorders Prevention in Construction
by Junqi Zhao, Esther Obonyo and Sven G. Bilén
Sensors 2021, 21(4), 1324; https://doi.org/10.3390/s21041324 - 13 Feb 2021
Cited by 30 | Viewed by 5181
Abstract
Construction workers executing manual-intensive tasks are susceptible to musculoskeletal disorders (MSDs) due to overexposure to awkward postures. Automated posture recognition and assessment based on wearable sensor output can help reduce MSDs risks through early risk-factor detection. However, extant studies mainly focus on optimizing recognition models. There is a lack of studies exploring the design of a wearable sensing system that assesses the MSDs risks based on detected postures and then provides feedback for injury prevention. This study aims at investigating the design of an effective wearable MSDs prevention system. This study first proposes the design of a wearable inertial measurement unit (IMU) sensing system, then develops the prototype for end-user evaluation. Construction workers and managers evaluated a proposed system by interacting with wearable sensors and user interfaces (UIs), followed by an evaluation survey. The results suggest that wearable sensing is a promising approach for collecting motion data with low discomfort; posture-based MSDs risk assessment has a high potential in improving workers’ safety awareness; and mobile- and cloud-based UIs can deliver the risk assessment information to end-users with ease. This research contributes to the design, development, and validation of wearable sensing-based injury prevention systems, which may be adapted to other labor-intensive occupations. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: MetaMotion C IMU sensor board and MetaBase interfaces; sensor placement (head, right arm, chest, right thigh, and calf); subjects working with sensors; CLN architecture integrating a one-layer CNN and a one-layer LSTM; posture recognition model output; mobile application UI (login, placement instructions, 30-min and daily assessment feedback); system architecture for MSD assessment information delivery; evaluation survey procedure and subjects; posture recognition confusion matrix; feature importance ranking; cloud-based dashboard interface.
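
The following sketch illustrates one way the posture-to-risk step described above could look in code: per-window posture labels are aggregated into cumulative exposure and mapped to a coarse risk level. The posture labels, window length, and exposure thresholds are hypothetical placeholders, not the assessment rules used in the paper.

```python
# Illustrative sketch: turning per-window posture predictions into a simple
# exposure-based risk flag. Labels and thresholds are hypothetical.
from collections import Counter

AWKWARD = {"stoop", "squat", "kneel"}          # assumed awkward-posture labels

def risk_level(window_labels, window_s=2.0, limit_s=600.0):
    """Flag risk when cumulative time in awkward postures exceeds a limit."""
    counts = Counter(window_labels)
    awkward_time = sum(counts[p] for p in AWKWARD) * window_s
    if awkward_time > limit_s:
        return "high"
    return "low" if awkward_time < 0.5 * limit_s else "moderate"

print(risk_level(["stand"] * 200 + ["stoop"] * 400))   # -> "high"
```
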
19 pages, 9027 KiB  
Article
Evaluation of Optical and Radar Based Motion Capturing Technologies for Characterizing Hand Movement in Rheumatoid Arthritis—A Pilot Study
by Uday Phutane, Anna-Maria Liphardt, Johanna Bräunig, Johann Penner, Michael Klebl, Koray Tascilar, Martin Vossiek, Arnd Kleyer, Georg Schett and Sigrid Leyendecker
Sensors 2021, 21(4), 1208; https://doi.org/10.3390/s21041208 - 9 Feb 2021
Cited by 7 | Viewed by 5142
Abstract
In light of the state-of-the-art treatment options for patients with rheumatoid arthritis (RA), a detailed and early quantification and detection of impaired hand function is desirable to allow personalized treatment regimens and amend currently used subjective patient-reported outcome measures. This is the motivation to apply and adapt modern measurement technologies to quantify, assess, and analyze human hand movement using a marker-based optoelectronic measurement system (OMS), which has been widely used to measure human motion. We complement these recordings with data from markerless (Doppler radar) sensors, and data from both sensor technologies are integrated with clinical outcomes of hand function. The technologies are leveraged to identify hand movement characteristics in RA-affected patients in comparison to healthy control subjects while performing functional tests, such as the Moberg Picking-Up Test. The results discuss the experimental framework and the limiting factors imposed by the use of marker-based measurements on hand function. The comparison of simple finger motion data, collected by the OMS, to data recorded by a simple continuous wave radar suggests that radar is a promising option for the objective assessment of hand function. Overall, the broad scope of integrating two measurement technologies with traditional clinical tests shows promising potential for developing new pathways in understanding the role of functional outcomes for the RA pathology. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: marker setup with 29 hand markers in open and reference postures; radar measurement setup (CW radar, absorber wall and mat, fingertip reference marker); hand postures for the recordings (MPUT with surface EMG, finger tapping, fist); index-finger hyper-extension tapping sequence; grip strength and MPUT times for RA and control groups; normalized thumb-index grip distances during the MPUT; short-time Fourier transforms of radar measurements with extracted fingertip speeds versus OMS reference data.
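
As a rough illustration of the radar-based fingertip-speed extraction mentioned above, the snippet below applies a short-time Fourier transform to a continuous-wave Doppler baseband signal and converts the dominant Doppler frequency to speed via v = f_d * c / (2 * f_c). The 24 GHz carrier and 2 kHz sampling rate are assumptions; the paper's radar parameters and processing chain may differ.

```python
# Illustrative sketch: fingertip speed from a CW Doppler radar signal via STFT.
# Carrier frequency and sampling rate are assumed values, not the paper's.
import numpy as np
from scipy.signal import stft

C = 3e8            # speed of light, m/s
F_CARRIER = 24e9   # assumed CW radar carrier frequency, Hz
FS = 2000          # assumed baseband sampling rate, Hz

def fingertip_speed(baseband, fs=FS):
    """Estimate speed over time from the dominant Doppler frequency."""
    f, t, Z = stft(baseband, fs=fs, nperseg=256)
    doppler = f[np.argmax(np.abs(Z), axis=0)]      # peak frequency per frame
    return t, doppler * C / (2 * F_CARRIER)        # v = f_d * c / (2 * f_c)

# Synthetic test: a 100 Hz Doppler tone maps to roughly 0.6 m/s at 24 GHz.
sig = np.cos(2 * np.pi * 100 * np.arange(0, 1, 1 / FS))
t, v = fingertip_speed(sig)
print(v.mean().round(3))
```
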
14 pages, 3078 KiB  
Article
Quantifying Coordination and Variability in the Lower Extremities after Anterior Cruciate Ligament Reconstruction
by Sangheon Park and Sukhoon Yoon
Sensors 2021, 21(2), 652; https://doi.org/10.3390/s21020652 - 19 Jan 2021
Cited by 4 | Viewed by 2719
Abstract
Patients experience various biomechanical changes following reconstruction for anterior cruciate ligament (ACL) injury. However, previous studies have focused on lower extremity joints as single joints rather than on simultaneous lower extremity movements. Therefore, this study aimed to determine the changes in lower limb coordination patterns according to movement type following ACL reconstruction. Twenty-one post-ACL-reconstruction patients (AG) and an equal number of healthy adults (CG) participated in this study. They were asked to perform walking, running, and cutting maneuvers. The continuous relative phase and its variability were calculated to examine the coordination pattern. During running and cutting at 30 and 60°, the AG demonstrated a lower in-phase hip–knee coordination pattern in the sagittal plane. The AG demonstrated low hip–knee variability in the sagittal plane during cutting at 60°. The low in-phase coordination pattern can burden the knee by generating unnatural movements following muscle contraction in the opposite direction. Based on these results, if coordination variables are measured promptly in the sports field with various sensors to evaluate the coordination of human movement, it would be possible to identify such problems and provide fundamental evidence for the optimal timing of return-to-sport after ACL reconstruction (ACLR) rehabilitation. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: marker attachment on the lower extremity; three-dimensional capture area (Qualisys Track Manager); continuous relative phase (CRP) data-processing procedure; ensemble-averaged hip–knee and knee–ankle CRP according to movement difficulty.
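
For context, the sketch below shows a common way to compute a continuous relative phase (CRP) between two joints: amplitude-normalize each joint's angular displacement and velocity, take the phase angle with arctan2, and subtract the distal from the proximal phase. The normalization convention is one of several in use and is assumed here rather than taken from the paper.

```python
# Illustrative sketch of a CRP computation; normalization choice is assumed.
import numpy as np

def phase_angle(angle, velocity):
    """Phase angle (degrees) from amplitude-normalized displacement and velocity."""
    a = 2 * (angle - angle.min()) / (angle.max() - angle.min()) - 1
    v = velocity / np.abs(velocity).max()
    return np.degrees(np.arctan2(v, a))

def crp(prox_angle, prox_vel, dist_angle, dist_vel):
    diff = phase_angle(prox_angle, prox_vel) - phase_angle(dist_angle, dist_vel)
    return (diff + 180) % 360 - 180              # wrap to (-180, 180]

# Synthetic example: two sinusoidal "joint angles" offset by 20 degrees.
t = np.linspace(0.0, 1.0, 101)
hip = np.sin(2 * np.pi * t)
knee = np.sin(2 * np.pi * t - np.radians(20))
values = crp(hip, np.gradient(hip, t), knee, np.gradient(knee, t))
print(np.round(values.mean(), 1))                # close to -20 degrees
```
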
20 pages, 4170 KiB  
Article
Interface Pressure System to Compare the Functional Performance of Prosthetic Sockets during the Gait in People with Trans-Tibial Amputation
by Salvador Ibarra Aguila, Gisel J. Sánchez, Eric E. Sauvain, B. Alemon, Rita Q. Fuentes-Aguilar and Joel C. Huegel
Sensors 2020, 20(24), 7043; https://doi.org/10.3390/s20247043 - 9 Dec 2020
Cited by 15 | Viewed by 5288
Abstract
The interface pressure between the residual limb and prosthetic socket has a significant effect on the amputee’s mobility and level of comfort with their prosthesis. This paper presents a socket interface pressure (SIFP) system to compare the interface pressure differences during gait between two different types of prosthetic sockets for a transtibial amputee. The system evaluates the interface pressure in six critical regions of interest (CROI) of the lower limb amputee and identifies the peak pressures during certain moments of the gait cycle. The six sensors were attached to the residual limb in the CROIs before the participant with transtibial amputation donned a prosthetic socket. The interface pressure was monitored and recorded while the participant walked on a treadmill for 10 min at 1.4 m/s. The results show peak pressure differences of almost 0.22 kgf/cm2 between the sockets. It was observed that the peak pressure occurred at 50% of the stance phase of the gait cycle. This SIFP system may be used by prosthetists, physical therapists, amputation care centers, and researchers, as well as government and private regulators requiring comparison and evaluation of prosthetic components, components under development, and testing. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: interface pressure between residual limb and socket; gait-cycle phases with a transtibial prosthesis; amputation line and residual anatomy pressure points; conceptual diagram of the SIFP system with motion capture; FlexiForce A201 connection diagram and SIFP device schematic; sensor characterization via dynamometer and linearization of the six pressure sensors; CT-based localization of the critical regions of interest; motion-capture setup and Kinect joint-coordinate projection; comparative tests against an F-Socket system on an artificial residuum; sensor divergence and normalized comparisons; gait cycle paired with knee and prosthetic-foot position; pressures on the residuum, sensor locations in the CROIs, and pressures during gait.
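
The sensor linearization step described in the paper's characterization can be illustrated with a simple least-squares calibration, as below. The load/voltage pairs and the 0.71 cm² sensing area are made-up calibration numbers used only to make the example runnable, not values from the paper.

```python
# Illustrative sketch: linearizing a force-sensing-resistor channel against
# known loads. The loads, voltages, and sensing area are invented numbers.
import numpy as np

loads_kgf = np.array([0, 5, 10, 15, 20])            # applied loads (kgf)
volts = np.array([0.02, 0.48, 0.95, 1.41, 1.90])    # measured output (V)

slope, intercept = np.polyfit(volts, loads_kgf, 1)  # least-squares line

def voltage_to_pressure(v, area_cm2=0.71):
    """Convert a sensor voltage to interface pressure in kgf/cm^2."""
    force_kgf = slope * v + intercept
    return force_kgf / area_cm2

print(round(voltage_to_pressure(1.2), 2))           # pressure at 1.2 V
```
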
29 pages, 1027 KiB  
Article
Body-Worn IMU Human Skeletal Pose Estimation Using a Factor Graph-Based Optimization Framework
by Timothy McGrath and Leia Stirling
Sensors 2020, 20(23), 6887; https://doi.org/10.3390/s20236887 - 2 Dec 2020
Cited by 34 | Viewed by 7127
Abstract
Traditionally, inertial measurement unit (IMU)-based human joint angle estimation requires a priori knowledge about sensor alignment or specific calibration motions. Furthermore, magnetometer measurements can become unreliable indoors. Without magnetometers, however, IMUs lack a heading reference, which leads to unobservability issues. This paper proposes a magnetometer-free estimation method, which provides desirable observability qualities under joint kinematics that sufficiently excite the lower body degrees of freedom. The proposed lower body model expands on the current self-calibrating human-IMU estimation literature and demonstrates a novel knee hinge model, the inclusion of segment length anthropometry, segment cross-leg length discrepancy, and the relationship between the knee axis and femur/tibia segment. The maximum a posteriori problem is formulated as a factor graph and inference is performed via post-hoc, on-manifold global optimization. The method is evaluated (N = 12) for a prescribed human motion profile task. Accuracy of the derived knee flexion/extension angle (4.34° root mean square error (RMSE)) without magnetometers is similar to the current state-of-the-art with magnetometer use. The developed framework can be expanded for modeling additional joints and constraints. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: the human-IMU kinematic system with IMU and anatomical-segment coordinate frames; relative angular velocity vectors projected onto the knee axis; factor graph representation of the estimation problem; placement of reflective markers and IMUs; process methodology for computing IMU-derived joint angles with a Levenberg–Marquardt solver.
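
The paper formulates the estimation as a factor graph solved by on-manifold optimization; as a much simpler, related illustration, the sketch below shows a commonly used magnetometer-free hinge constraint for identifying the knee axis in the two IMU frames from gyroscope data alone. It is not the authors' method, and the spherical parameterization and initial guess are arbitrary choices.

```python
# Illustrative sketch, not the paper's factor-graph method: a hinge constraint
# that matches the gyroscope components perpendicular to the knee axis in the
# thigh- and shank-IMU frames.
import numpy as np
from scipy.optimize import least_squares

def axis_from_spherical(theta, phi):
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def residuals(params, gyro_thigh, gyro_shank):
    j1 = axis_from_spherical(params[0], params[1])
    j2 = axis_from_spherical(params[2], params[3])
    return (np.linalg.norm(np.cross(gyro_thigh, j1), axis=1)
            - np.linalg.norm(np.cross(gyro_shank, j2), axis=1))

def estimate_knee_axes(gyro_thigh, gyro_shank):
    """gyro_* : (N, 3) angular velocities from the two leg-mounted IMUs."""
    sol = least_squares(residuals, x0=[0.5, 0.5, 0.5, 0.5],
                        args=(gyro_thigh, gyro_shank))
    return (axis_from_spherical(sol.x[0], sol.x[1]),
            axis_from_spherical(sol.x[2], sol.x[3]))
```

In practice, synchronized thigh and shank gyroscope logs (in rad/s) would be passed to estimate_knee_axes, and the sign ambiguity of the returned axes resolved separately.
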
19 pages, 2316 KiB  
Article
Wearable Biofeedback Improves Human-Robot Compliance during Ankle-Foot Exoskeleton-Assisted Gait Training: A Pre-Post Controlled Study in Healthy Participants
by Cristiana Pinheiro, Joana Figueiredo, Nuno Magalhães and Cristina P. Santos
Sensors 2020, 20(20), 5876; https://doi.org/10.3390/s20205876 - 17 Oct 2020
Cited by 12 | Viewed by 3825
Abstract
The adjunctive use of biofeedback systems with exoskeletons may accelerate post-stroke gait rehabilitation. Wearable patient-oriented human-robot interaction-based biofeedback is proposed to improve patient-exoskeleton compliance regarding the interaction torque’s direction (joint motion strategy) and magnitude (user participation strategy) through auditory and vibrotactile cues during assisted gait training, respectively. Parallel physiotherapist-oriented strategies are also proposed such that physiotherapists can follow in real-time a patient’s motor performance towards effective involvement during training. A preliminary pre-post controlled study was conducted with eight healthy participants to conclude about the biofeedback’s efficacy during gait training driven by an ankle-foot exoskeleton and guided by a technical person. For the study group, performance related to the interaction torque’s direction increased during (p-value = 0.07) and after (p-value = 0.07) joint motion training. Further, the performance regarding the interaction torque’s magnitude significantly increased during (p-value = 0.03) and after (p-value = 68.59 × 10⁻³) user participation training. The experimental group and a technical person reported promising usability of the biofeedback and highlighted the importance of the timely cues from physiotherapist-oriented strategies. Less significant improvements in patient–exoskeleton compliance were observed in the control group. The overall findings suggest that the proposed biofeedback was able to improve the participant-exoskeleton compliance by enhancing human-robot interaction; thus, it may be a powerful tool to accelerate post-stroke ankle-foot deformity recovery. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: graphical abstract; ankle–foot exoskeleton trajectory-tracking control diagram; vibrotactile shank bands and biofeedback system components and electronic interfaces; participant and technical person during joint motion and user participation biofeedback training; per-procedure results (interaction torque, performance, RMSE, and delay) for the joint motion and user participation strategies.
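
A minimal sketch of rule-based cue selection in the spirit of the strategies described above is given below: an auditory cue when the interaction torque opposes the reference motion, and a vibrotactile cue when its magnitude indicates low user participation. The trigger conditions and the threshold are invented for illustration and are not the paper's control rules.

```python
# Illustrative sketch of cue selection for one control cycle.
# Trigger rules and threshold are hypothetical, not the paper's.
def biofeedback_cues(interaction_torque, ref_velocity, mag_threshold=0.5):
    """Return which cues to trigger for one control cycle."""
    cues = []
    if interaction_torque * ref_velocity < 0:       # user resists the motion
        cues.append("auditory")
    if abs(interaction_torque) < mag_threshold:     # low user participation
        cues.append("vibrotactile")
    return cues

print(biofeedback_cues(-0.2, 1.0))    # ['auditory', 'vibrotactile']
```
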
14 pages, 1736 KiB  
Article
Estimating Lower Extremity Running Gait Kinematics with a Single Accelerometer: A Deep Learning Approach
by Mohsen Gholami, Christopher Napier and Carlo Menon
Sensors 2020, 20(10), 2939; https://doi.org/10.3390/s20102939 - 22 May 2020
Cited by 59 | Viewed by 7701
Abstract
Abnormal running kinematics are associated with an increased incidence of lower extremity injuries among runners. Accurate and unobtrusive running kinematic measurement plays an important role in the detection of gait abnormalities and the prevention of injuries among runners. Inertial-based methods have been proposed to address this need. However, previous methods require cumbersome sensor setups or participant-specific calibration. This study aims to validate a shoe-mounted accelerometer for sagittal plane lower extremity angle measurement during running based on a deep learning approach. A convolutional neural network (CNN) architecture was selected as the regression model to generalize in inter-participant scenarios and to minimize poorly estimated joints. Motion and accelerometer data were recorded from ten participants while running on a treadmill at five different speeds. The reference joint angles were measured by an optical motion capture system. The CNN model predictions deviated from the reference angles with a root mean squared error (RMSE) of less than 3.5° and 6.5° in intra- and inter-participant scenarios, respectively. Moreover, we provide an estimation of six important gait events with a mean absolute error of less than 2.5° and 6.5° in intra- and inter-participant scenarios, respectively. This study highlights an appealing minimal sensor setup approach for gait analysis purposes. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: experimental setup with six motion-capture cameras and a split-belt treadmill, plus reflective marker positions on the lower extremity; sample raw accelerometer signal; data splits for inter- and intra-participant training and testing; gait events over a sample gait cycle; average estimated versus reference angles for the intra- and inter-participant models.
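
The following is a minimal sketch of a 1D convolutional regressor of the general kind described in this abstract: a window of shoe-mounted accelerometer samples in, sagittal-plane joint angles out. The window length, channel count, and filter sizes are assumptions, not the authors' architecture.

```python
# Illustrative sketch: 1D CNN regression from accelerometer windows to three
# sagittal-plane joint angles. Architecture details are assumed.
import torch
import torch.nn as nn

class GaitAngleCNN(nn.Module):
    def __init__(self, n_channels=3, n_angles=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.regressor = nn.Linear(64, n_angles)

    def forward(self, x):                  # x: (batch, channels, time)
        return self.regressor(self.features(x).squeeze(-1))

model = GaitAngleCNN()
window = torch.randn(8, 3, 200)            # batch of accelerometer windows
print(model(window).shape)                 # torch.Size([8, 3])
```
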
12 pages, 1485 KiB  
Article
Gait Characteristics under Imposed Challenge Speed Conditions in Patients with Parkinson’s Disease During Overground Walking
by Myeounggon Lee, Changhong Youm, Byungjoo Noh, Hwayoung Park and Sang-Myung Cheon
Sensors 2020, 20(7), 2132; https://doi.org/10.3390/s20072132 - 10 Apr 2020
Cited by 14 | Viewed by 3517
Abstract
Evaluating gait stability at slower, faster, and self-preferred speeds based on continuous steps may assist in determining the severity of motor symptoms in Parkinson’s disease (PD) patients. This study aimed to investigate gait ability under imposed speed conditions in PD patients during overground walking. Overall, 74 PD patients and 52 age-matched healthy controls were recruited. Levodopa was administered to patients in the PD group, and all participants completed imposed slower, preferred, and faster speed walking tests along a straight 15-m walkway wearing shoe-type inertial measurement units. Reliability of the slower and faster conditions between the estimated and measured speeds indicated excellent agreement for PD patients and controls. PD patients demonstrated higher gait asymmetry (GA) and coefficients of variation (CV) for stride length and stance phase than the controls at slower speeds, and higher CVs for the single support, double support, and stance phases. The CV of the double support phase could distinguish between PD patients and controls at faster speeds. The GA and CVs of stride length and phase-related variables were associated with motor symptoms in PD patients. Speed conditions should be considered during gait analysis. Gait variability could evaluate the severity of motor symptoms in PD patients. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: recruitment process flowchart; data collection and analysis under steady-state conditions with shoe-type IMU gait-event detection (100 Hz, heel strike and toe-off); Bland–Altman plots of agreement between estimated and measured overground walking speeds for PD patients and controls.
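
The variability measures referred to above can be illustrated with a few lines of code. The coefficient of variation is the standard SD/mean ratio; the log-ratio form of gait asymmetry is one common definition and is assumed here rather than taken from the paper.

```python
# Illustrative sketch of stride-to-stride variability measures.
# The gait asymmetry formula is an assumed common definition.
import numpy as np

def coefficient_of_variation(values):
    """CV (%) of a stride-level gait variable, e.g., stride length."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def gait_asymmetry(left_swing_times, right_swing_times):
    """GA (%) as 100 * |ln(short/long)| of mean swing times (assumed form)."""
    l, r = np.mean(left_swing_times), np.mean(right_swing_times)
    return 100.0 * abs(np.log(min(l, r) / max(l, r)))

strides = [1.21, 1.18, 1.25, 1.16, 1.22]
print(round(coefficient_of_variation(strides), 2))
print(round(gait_asymmetry([0.38, 0.39, 0.40], [0.42, 0.41, 0.43]), 2))
```
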
14 pages, 1317 KiB  
Article
Gait Characteristics Based on Shoe-Type Inertial Measurement Units in Healthy Young Adults during Treadmill Walking
by Myeounggon Lee, Changhong Youm, Byungjoo Noh and Hwayoung Park
Sensors 2020, 20(7), 2095; https://doi.org/10.3390/s20072095 - 8 Apr 2020
Cited by 14 | Viewed by 3331
Abstract
This study investigated the gait characteristics of healthy young adults using shoe-type inertial measurement units (IMU) during treadmill walking. A total of 1478 participants were tested. Principal component analyses (PCA) were conducted to determine which principal components (PCs) best defined the characteristics of healthy young adults. A non-hierarchical cluster analysis was conducted to evaluate the essential gait ability, according to the results of the PC1 score. One-way repeated analysis of variance with the Bonferroni correction was used to compare gait performances in the cluster groups. PCA outcomes indicated 76.9% variance for PC1–PC6, where PC1 (gait variability (GV): 18.5%), PC2 (pace: 17.8%), PC3 (rhythm and phase: 13.9%), and PC4 (bilateral coordination: 11.2%) were the gait-related factors. All of the pace, rhythm, GV, and variables for bilateral coordination classified the gait ability in the cluster groups. We suggest that the treadmill walking task may be reliable to evaluate the gait performances, which may provide insight into understanding the decline of gait ability. The presented results are considered meaningful for understanding the gait patterns of healthy adults and may prove useful as reference outcomes for future gait analyses. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: shoe-type IMU system and gait-event detection (heel strike and toe-off); PCA outcomes (cumulative variance of 76.9% and scree plot of 28 components); pace, rhythm, phase, gait variability, and bilateral coordination results by cluster group.
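
A minimal sketch of the analysis pipeline described above (standardization, PCA, and a non-hierarchical clustering on the first principal-component score) is shown below using scikit-learn. The synthetic data, the choice of k-means, and the number of clusters are placeholders; only the participant count and the six retained components echo the abstract.

```python
# Illustrative sketch: standardize gait features, extract principal
# components, then cluster on the PC1 score. Data and cluster count are fake.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(1478, 28))            # participants x gait variables

scores = PCA(n_components=6).fit_transform(StandardScaler().fit_transform(X))
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    scores[:, [0]])                        # cluster on the PC1 score only

print(np.bincount(groups))                 # group sizes
```
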
21 pages, 3728 KiB  
Article
Research on a Pedestrian Crossing Intention Recognition Model Based on Natural Observation Data
by Hongjia Zhang, Yanjuan Liu, Chang Wang, Rui Fu, Qinyu Sun and Zhen Li
Sensors 2020, 20(6), 1776; https://doi.org/10.3390/s20061776 - 23 Mar 2020
Cited by 27 | Viewed by 5589
Abstract
Accurate identification of pedestrian crossing intention is of great significance to the safe and efficient driving of future fully automated vehicles in the city. This paper focuses on pedestrian intention recognition on the basis of pedestrian detection and tracking. A large number of natural crossing sequence data of pedestrians and vehicles were first collected by a laser scanner and an HD camera, and 1980 effective pedestrian crossing samples were selected. Influencing parameter sets of pedestrian crossing intention were then obtained through statistical analysis. Finally, a long short-term memory network with an attention mechanism (AT-LSTM) is proposed. Compared with the support vector machine (SVM) model, the results show that when the pedestrian crossing intention is recognized 0 s prior to crossing, the recognition accuracy of the AT-LSTM model for pedestrian crossing intention is 96.15%, which is 6.07% higher than that of the SVM model; when the pedestrian crossing intention is recognized 0.6 s prior, the recognition accuracy of the AT-LSTM model is 90.68%, which is 4.85% higher than that of the SVM model. The determination of the pedestrian crossing intention parameter set and the more accurate recognition of pedestrian intention provided in this work lay a foundation for future fully automated vehicles. Full article
(This article belongs to the Special Issue Smart Sensors: Applications and Advances in Human Motion Analysis)
Figures: pedestrian crossing intention recognition framework; LSTM framework integrating the attention mechanism; photographs of the experimental section and parameter acquisition equipment; time series of pedestrian crossing; vehicle-to-zebra-crossing distance, vehicle speed, TTC, pedestrian speed, and pedestrian-to-zebra-crossing distance under different crossing intentions; ROC curves and confusion matrices for identification 0 s and 0.6 s in advance (AT-LSTM vs. SVM).
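
As an illustration of an LSTM with an attention mechanism of the general kind proposed above, the sketch below scores a short sequence of pedestrian/vehicle features as crossing versus not crossing. The feature count, hidden size, and additive attention form are assumptions, not the authors' exact AT-LSTM.

```python
# Illustrative sketch: LSTM with a simple attention pooling over time steps.
# Feature count, hidden size, and attention form are assumed.
import torch
import torch.nn as nn

class ATLSTM(nn.Module):
    def __init__(self, n_features=6, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)
        self.out = nn.Linear(hidden, 2)     # crossing vs. not crossing

    def forward(self, x):                   # x: (batch, time, features)
        h, _ = self.lstm(x)                 # (batch, time, hidden)
        w = torch.softmax(self.attn(h), dim=1)      # attention weights
        context = (w * h).sum(dim=1)        # weighted sum over time
        return self.out(context)

model = ATLSTM()
seq = torch.randn(4, 25, 6)                 # 4 samples, 25 time steps
print(model(seq).shape)                      # torch.Size([4, 2])
```
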