Motion-based Grasp Selection

Marcus Gardner

The loss of a hand is a profound disability, given the hand's versatility and its ability to manipulate objects with speed and accuracy. The development of myoelectric hand prostheses has given amputees the opportunity to return to a degree of normality, restoring the ability to grasp different objects and to carry out a multitude of day-to-day activities. Despite improvements in mechanical design, electronic components, battery life, and weight, the underlying control strategies have changed little.


Traditional Control 

The latest commercialized myoelectric hands offer simple electromyography (EMG) based human-machine interfaces capable of providing on/off and proportional control to open and close the hand, using a pair of EMG electrodes embedded in the prosthesis forearm over opposing muscle groups. A set of predefined grasp patterns is used in conjunction with the EMG sensors: the user switches between grasps with a physical switch on the back of the hand or a unique sequence of open/close EMG signals. This method has proven reliable, yet it remains severely limited in functionality compared with a human hand. Only a small number of grasps can be preprogrammed at one time, greatly inhibiting utility. Current grasp selection strategies have a steep learning curve, and with only visual feedback available, myoelectric hand control can place a significant cognitive burden on the user.
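To make the conventional scheme concrete, the following is a minimal sketch of two-site on/off plus proportional control, assuming a stream of raw samples from a pair of electrodes; the filter cutoff, threshold, and gain are illustrative placeholders rather than clinical settings.

```python
import numpy as np

class ProportionalEMGController:
    """Sketch of conventional two-site EMG control: each channel is
    rectified and low-pass filtered to an envelope; the stronger
    channel above threshold drives the hand open or closed at a
    speed proportional to contraction intensity."""

    def __init__(self, fs=1000.0, cutoff_hz=3.0, threshold=0.1, gain=1.0):
        # One-pole low-pass filter coefficient for the envelope
        self.alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / fs)
        self.threshold = threshold
        self.gain = gain
        self.env = np.zeros(2)  # envelopes for [extensor, flexor] sites

    def step(self, emg_sample):
        """emg_sample: (extensor, flexor) raw EMG values for one tick.
        Returns a signed velocity command: + opens, - closes, 0 idle."""
        rectified = np.abs(np.asarray(emg_sample, dtype=float))
        self.env += self.alpha * (rectified - self.env)
        ext, flex = self.env
        if max(ext, flex) < self.threshold:
            return 0.0                                   # below threshold: no motion
        if ext >= flex:
            return self.gain * (ext - self.threshold)    # open
        return -self.gain * (flex - self.threshold)      # close
```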

There has been considerable interest in adapting pattern recognition techniques to extract useful features from the EMG signal and thereby control more degrees of freedom. These methods have the potential to provide direct control over a hand prosthesis, but they have not progressed clinically: the signal varies greatly with factors such as arm posture, electrode alignment, and level of fatigue, and the approach increases the learning required of the user. Without a clinically viable solution for direct control, there is potential to improve the traditional approach by offloading the cognitive burden of grasp selection elsewhere. This project presents the first IMU-based control strategy to exploit natural motion for hand prosthesis control.
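As an illustration of the pattern recognition approach described above, the sketch below computes classic time-domain EMG features (mean absolute value, waveform length, zero crossings) over windows of multichannel EMG and feeds them to a linear discriminant classifier. The data are random placeholders, and the window size, channel count, and class labels are assumptions for demonstration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """Time-domain EMG features for one window of shape
    (n_samples, n_channels): mean absolute value, waveform
    length, and zero-crossing count per channel."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    signs = np.signbit(window).astype(int)
    zc = np.sum(np.abs(np.diff(signs, axis=0)), axis=0)
    return np.concatenate([mav, wl, zc])

# Placeholder data: 200 windows of 256 samples from 4 EMG channels,
# labelled with one of 3 hypothetical grasp classes.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 256, 4))
labels = rng.integers(0, 3, size=200)

X = np.array([td_features(w) for w in windows])
clf = LinearDiscriminantAnalysis().fit(X, labels)
print(clf.predict(X[:5]))  # predicted grasp class for the first windows
```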


A Sensor Suite for Autonomous Control

The proposed system integrates several sensor technologies into a complete sensor suite that offloads the cognitive burden of controlling the prosthetic hand to a passive, autonomous control architecture. The system combines computer vision, inertial motion sensing, and muscle vibration sensors to autonomously predict and select the desired grasp pattern on the prosthetic hand. The envisioned architecture currently comprises four subsystems (a sketch of how they fit together follows the list):

1) Grasp Activation: Grasp activation involves the use of a pair of microphone-based mechanomyogram (MMG) sensors to discriminate between opening and closing the hand.

2) Object Recognition: An object recognition system consisting of an inexpensive webcam fixed to the forearm is used to classify objects into different classes. Each class contains a set of grasp patterns that can be used to interact with the object.

3) Grasp Prediction: Once the object has been classified, the corresponding grasp set is used in the grasp prediction system, which selects the grasp based on the position of the reaching hand.

4) Object Interaction: A Bebionic V2 by RSLSteeper is used as the prosthetic device for object interaction. It offers 14 selectable grasp patterns, 8 of which are accessible at any one time. The thumb is manually opposable, and each thumb position can be programmed with up to 4 grasps, selected via two switching mechanisms: a physical switch on the back of the hand, and a digital switch triggered by sending an 'open' signal through the EMG sensors while the hand is already open.
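To show how the four subsystems could fit together at runtime, the following sketch wires hypothetical stand-ins for each component into a single control loop; the class names, method names, and grasp-set table are assumptions for illustration, not the actual implementation.

```python
from dataclasses import dataclass

# Hypothetical grasp sets per object class; the real system would
# define these per the object recognition subsystem's classes.
GRASP_SETS = {
    "bottle": ["power", "tripod"],
    "key":    ["lateral"],
    "plate":  ["finger_point", "open_palm"],
}

@dataclass
class GraspController:
    """Sketch of the four-subsystem pipeline. The subsystem objects
    (vision, imu, mmg, hand) and their method names stand in for
    the real components."""
    vision: object   # webcam-based object recognizer
    imu: object      # inertial tracker for the reaching hand
    mmg: object      # muscle-vibration open/close detector
    hand: object     # prosthesis driver (e.g., Bebionic-style interface)

    def update(self):
        # 2) Object recognition narrows the candidate grasp set.
        obj_class = self.vision.classify()        # e.g. "bottle"
        candidates = GRASP_SETS.get(obj_class, ["power"])

        # 3) Grasp prediction picks a grasp from the reach kinematics.
        reach_pose = self.imu.hand_pose()         # position/orientation estimate
        grasp = self.predict_grasp(candidates, reach_pose)
        self.hand.preshape(grasp)

        # 1) Grasp activation: MMG decides when to actually close/open.
        command = self.mmg.detect()               # "close", "open", or None
        if command == "close":
            self.hand.close()
        elif command == "open":
            self.hand.open()

    def predict_grasp(self, candidates, pose):
        # Placeholder: the real system maps hand position to a grasp
        # within the set; here we simply take the first candidate.
        return candidates[0]
```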

Co-Investigators

Professor Khoo Boo Cheong, Mechanical Engineering, National University of Singapore, Singapore

Professor Etienne Burdet, Bioengineering, Imperial College London, UK

Sponsors

Imperial College London-National University of Singapore Joint PhD Program

RSLSteeper (industrial collaborator)


Multimedia

Video demonstrating vision-based grip switching with our in-house mechanomyography sensors for open/close commands

See our videos page for a narrated video (click hyperlink for YouTube posting) summarizing the system originally presented at the IEEE International Conference on Rehabilitation Robotics (ICORR)