PhD Student in Robotics

Publications

Robotic exoskeletons for the hand have the potential to enhance physical rehabilitation and augment functional recovery following neurological injury. However, before such devices can be deployed in the home and clinic, robust experimental validation of performance capabilities is required. Currently, researchers rely on human subjects for these validations, but this approach makes it difficult to separate the contributions of the device from those of the active wearer. To address the need for a robust mechanical analog of the human hand in exoskeleton validation, we have produced a low-cost, open-source, instrumented hand. This design features variable-stiffness joints with position sensing, anthropomorphic palm and finger phalanges with human-like joint couplings, thumb origin location, and kinematic structure. Importantly, the novel joint design enables human-like double exponential finger joint stiffness. It improves on the state of the art, which is limited to linear joint stiffness profiles, non-anthropomorphic size and shape, and inaccurate thumb kinematics. In this paper, we detail the design of this device and validate its performance for use in the design and evaluation of wearable systems.
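
As a rough illustration of the double exponential stiffness idea, the Python sketch below evaluates an assumed double-exponential torque-angle model alongside a linear profile; the functional form and all parameter values are illustrative assumptions, not the joint model published in the paper.

import numpy as np

def double_exponential_torque(theta, a=0.01, b=4.0, c=0.01, d=3.0):
    # Assumed double-exponential passive joint torque: resistance grows
    # exponentially toward both ends of the range of motion (parameters are made up).
    return a * (np.exp(b * theta) - 1.0) - c * (np.exp(-d * theta) - 1.0)

theta = np.linspace(-0.5, 1.2, 100)            # joint angle in radians
tau_nonlinear = double_exponential_torque(theta)
tau_linear = 0.05 * theta                      # linear-stiffness baseline for comparison
stiffness = np.gradient(tau_nonlinear, theta)  # angle-dependent stiffness d(tau)/d(theta)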

The viscoelastic properties of human soft tissue influence the nature of interaction at attachment points in wearable devices. Characterizing these properties is especially critical for understanding physical human-robot interaction (pHRI) for exoskeleton design, prosthetics, and similar fields. This paper presents the design and control of a novel actuated indenter for the measurement of human soft tissue properties. The accuracy of position (0.025 mm) and force (111 mN) measurements allows for repeatable and controlled tissue deformation and monitoring at 36 different angular locations (10° increments). Controlled indentation displacement with force feedback is used in three types of applications on a human subject’s forearm. The soft tissue stiffness profile, viscoelastic relaxation behavior, and frequency domain response are measured. The ability to perform multiple characterizations with one indenter device broadens the scope of a human-centered approach to design, modeling, and perception as they relate to pHRI interfaces.
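
For context on the relaxation measurements, the sketch below fits a single-time-constant exponential relaxation curve to force data with SciPy; the model choice, parameter names, and placeholder data are assumptions for illustration, not the characterization procedure used in the paper.

import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, f_inf, f_1, tau):
    # Force decays from f_inf + f_1 toward a steady-state value f_inf with time constant tau.
    return f_inf + f_1 * np.exp(-t / tau)

# Placeholder data standing in for force recorded during a constant-displacement hold.
t_data = np.linspace(0.0, 30.0, 300)                       # seconds
f_data = relaxation(t_data, 1.2, 0.8, 5.0) + 0.01 * np.random.randn(t_data.size)

params, _ = curve_fit(relaxation, t_data, f_data, p0=[1.0, 1.0, 1.0])
f_inf, f_1, tau = params   # steady-state force, relaxing amplitude, relaxation time constant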

Physical human-robot interaction (pHRI) interfaces are responsible for ensuring safe, comfortable, and effective force transfer between wearable devices and their users. However, analysis is often oversimplified by treating the human-robot attachment as a rigid connection and using gross load measurements. As a result, information about the distribution of forces across the human-robot contact surface is lost. In this paper, we present an analysis method to predict distributed loading across a pHRI interface based on a model with discretized elastic elements that account for compliance from human soft tissue and the robot attachment. Stiffness properties of a proxy upper arm are measured with an indenter and used in the pHRI interface model. The analysis is performed assuming a rigid arm model, consistent with the underlying assumption in literature, and repeated using the proposed compliant arm model with measured elastic properties. The distributed loading predicted by the pHRI interface model is validated with measurements from a sensorized upper arm cuff on the Harmony exoskeleton. Our results reveal that a model incorporating compliance at the human-robot attachment is necessary to improve prediction of distributed interface loads. This motivates the need for human-centered analysis which can enable finer control of interaction forces and help design more ergonomic attachment interfaces.
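
A minimal sketch of the discretized-elastic-element idea, assuming each contact element is a series combination of a soft-tissue spring and an attachment spring; the element count and stiffness values below are placeholders, not measured properties from the paper.

import numpy as np

# Per-element stiffnesses (N/mm, placeholder values): soft tissue in series with the cuff shell.
k_tissue = np.full(24, 2.0)
k_cuff = np.full(24, 50.0)
k_eff = (k_tissue * k_cuff) / (k_tissue + k_cuff)   # effective stiffness of each series pair

# Given a local displacement at each discrete element (mm), the distributed load follows directly.
delta = np.linspace(0.0, 1.5, 24)    # example displacement profile across the interface
forces = k_eff * delta               # predicted per-element interface force (N)
total_load = forces.sum()            # gross load, comparable to a single-point load cell reading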

EMG-based intention recognition and assistive device control are often developed separately, which can have the unintended consequence of requiring excessive muscular effort and causing fatigue during operation. In this paper, we address two important aspects of the performance of an integrated EMG-based assistive system. First, we investigate the effects of muscular effort on EMG-based classification and robot control. Second, we propose a robot control solution that reduces the muscular effort required in assisted dynamic daily tasks compared to state-of-the-art control methods.
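
As a simple example of how muscular effort might be quantified from raw EMG, the sketch below computes a moving-RMS envelope normalized by a maximum voluntary contraction (MVC) reference; the window length and normalization are assumptions, not the measures used in the paper.

import numpy as np

def rms_envelope(emg, window=200):
    # Moving RMS of a raw EMG channel; the window length (in samples) is an assumption.
    emg = np.asarray(emg, dtype=float)
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(emg ** 2, kernel, mode='same'))

def effort_level(emg, mvc_rms, window=200):
    # Effort expressed as a fraction of a maximum voluntary contraction (MVC) reference.
    return rms_envelope(emg, window) / mvc_rms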

We have developed a one-of-a-kind hand exoskeleton, called Maestro, which uses compliant joints to power the finger movements of individuals living with severe disabilities so they can complete daily tasks. In this paper, we present results from an electromyography (EMG) control study conducted with spinal cord injury (SCI) patients (C5, C6, and C7), in which the subjects completed daily tasks by controlling Maestro with EMG signals from their forearm muscles. With its compliant actuation and its degrees of freedom that match natural finger movements, Maestro is capable of helping the subjects grasp and manipulate a variety of daily objects (more than 15 from a standardized set). To generate control commands for Maestro, an artificial neural network algorithm was implemented along with a probabilistic control approach to robustly classify and deliver four hand poses using three EMG signals measured from the forearm and palm. Increased scores on a standardized assessment, the Sollerman hand function test, and improvements in different aspects of grasping, such as strength, demonstrate the feasibility of Maestro improving the hand function of SCI subjects.
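
A rough sketch of that classification pipeline, using a small feed-forward network over three EMG features and a simple probability-averaging rule; the feature choice, network size, and confidence threshold are assumptions, not Maestro's published implementation.

import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder data: one RMS feature per EMG channel (3 channels), labels for 4 hand poses.
X = np.random.rand(200, 3)
y = np.random.randint(0, 4, size=200)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)

# Probabilistic smoothing: average class probabilities over a short history of windows
# and only issue a pose command when the averaged confidence exceeds a threshold.
probs = clf.predict_proba(np.random.rand(10, 3))
avg = probs.mean(axis=0)
command = int(np.argmax(avg)) if avg.max() > 0.7 else None   # hold the current pose if unsure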

Measurement of interaction forces distributed across the attachment interface in wearable devices is critical for understanding ergonomic physical human–robot interaction (pHRI). The main challenges in sensorization of pHRI interfaces are (i) capturing the fine nature of force transmission from compliant human tissue onto rigid surfaces in the wearable device and (ii) utilizing a low-cost and easily implementable design that can be adapted for a variety of human interfaces. This paper addresses both challenges and presents a modular sensing panel that uses force-sensing resistors (FSRs) combined with robust electrical and mechanical integration principles that result in a reliable solution for distributed load measurement. The design is demonstrated through an upper-arm cuff, which uses 24 sensing panels, in conjunction with the Harmony exoskeleton. Validation of the design with controlled loading of the sensorized cuff proves the viability of FSRs in an interface sensing solution. Preliminary experiments with a human subject highlight the value of distributed interface force measurement in recognizing the factors that influence ergonomic pHRI and elucidating their effects. The modular design and low cost of the sensing panel lend themselves to extension of this approach for studying ergonomics in a variety of wearable applications with the goal of achieving safe, comfortable, and effective human–robot interaction.
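
As a small illustration of FSR-based force sensing, the sketch below converts a voltage-divider reading from one panel element into a force estimate; the circuit values and power-law calibration constants are assumptions, not the calibration reported in the paper.

def fsr_resistance(v_out, v_in=3.3, r_fixed=10000.0):
    # FSR in a voltage divider with a fixed resistor; returns FSR resistance in ohms.
    return r_fixed * (v_in - v_out) / v_out

def fsr_force(v_out, k=5.0e5, n=1.0):
    # Assumed power-law calibration F = k / R**n mapping resistance to force in newtons.
    return k / fsr_resistance(v_out) ** n

print(round(fsr_force(1.2), 2))   # force estimate for a 1.2 V reading on one panel element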

Successful ergonomic human-robot collaboration depends on accurate prediction and control of forces transmitted at human-robot interfaces. Unfortunately, such interaction force measurements are typically performed at a gross level (using joint torque sensors or single-point interface load cells) without capturing the distributed loading across the entire interface surface. Recognizing the need for a reliable and comprehensive sensing solution, in this paper we present a sensorized upper arm cuff that uses force-sensing resistors (FSRs). Pilot experiments with the Harmony upper-body exoskeleton are presented to validate the need for such a device and its design. The results show that measuring interaction forces through a single-point interface load cell alone is not sufficient to provide a complete picture of the distributed loads experienced by the user wearing the robot. We thus propose further experiments and extensions of the work presented here to gain a better understanding of human-robot interaction in wearable robots, with the goal of making it safer, more ergonomic, and more efficient.

Despite mechanical advancements in assistive hand exoskeletons, the manipulation ability they provide has remained far inferior to that of a healthy human hand. State-of-the-art control strategies focus mainly on robot joint-level position control, although accurate control of fingertip positions and forces is required for human-like dexterity. Due to nonlinear relationships between inputs and outputs, dexterous manipulation requires accurate models of the interaction between the fingers, the exoskeleton, and the fingertip space. In this research, we use model-based control to achieve desired fingertip positions and forces with a multi-degree-of-freedom (multi-DOF) exoskeleton for the first time. We compare it with conventional control methods and demonstrate that its performance is superior and within human accuracy levels.
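
A compact sketch of the Jacobian-transpose statics that model-based fingertip force control builds on, using a generic two-link planar finger; the link lengths and joint angles are illustrative and do not describe Maestro's kinematic model.

import numpy as np

def fingertip_jacobian(q, l1=0.04, l2=0.03):
    # Jacobian of a generic planar 2-DOF finger; link lengths (m) are illustrative only.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

q = np.array([0.4, 0.6])               # joint angles (rad)
f_des = np.array([0.0, 2.0])           # desired fingertip force (N)
tau = fingertip_jacobian(q).T @ f_des  # joint torques realizing the fingertip force: tau = J^T f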

In this paper, we address two of the most important challenges in the development and control of assistive hand orthoses. First, supported by experimental results, we present a method to determine an optimal set of grasping poses essential for grasping daily objects. Second, we present a method for determining the minimal number of surface EMG sensors, and their locations, needed to carry out EMG-based intention recognition and to control the assistive device by differentiating between the hand poses.
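
One way to frame the minimal-sensor-set question is as a subset search scored by classification accuracy; the sketch below exhaustively scores sensor subsets of increasing size and stops at the smallest set within a tolerance of full-set accuracy. The classifier, accuracy tolerance, and placeholder data are assumptions, not the selection procedure from the paper.

import numpy as np
from itertools import combinations
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X = np.random.rand(300, 8)              # placeholder features from 8 candidate EMG sites
y = np.random.randint(0, 4, size=300)   # placeholder hand-pose labels

def best_subset_of_size(k):
    # Score every k-sensor subset by cross-validated accuracy and return the best one.
    scored = [(cross_val_score(SVC(), X[:, list(c)], y, cv=3).mean(), c)
              for c in combinations(range(X.shape[1]), k)]
    return max(scored)

full_acc = cross_val_score(SVC(), X, y, cv=3).mean()
for k in range(1, X.shape[1] + 1):
    acc, sensors = best_subset_of_size(k)
    if acc >= full_acc - 0.05:          # a 5% tolerance is an assumed stopping criterion
        print(k, sensors, round(acc, 3))
        break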

In this paper, we present an electromyography (EMG)-driven assistive hand exoskeleton for spinal-cord-injury (SCI) patients. We developed an active assistive orthosis, called Maestro, which is light, comfortable, compliant, and capable of providing various hand poses. The EMG signals are obtained from a subject’s forearm, post-processed, and classified for operating Maestro. The performance of Maestro is evaluated by a standardized hand function test, called the Sollerman hand function test. The experimental results show that Maestro improved the hand function of the SCI patients.