gms | German Medical Science

Deutscher Kongress für Orthopädie und Unfallchirurgie (DKOU 2018)

23.10. - 26.10.2018, Berlin

Intelligent, context-aware myoelectric hand prosthesis

Meeting Abstract


  • Jeremy Mouchoux (presenting author) - Applied Surgical and Rehabilitation Technology Lab, Dept. for Trauma Surgery, Orthopedics and Plastic Surgery, University Medical Center Göttingen, Göttingen, Germany
  • Stefano Carisi - Department of BioMechanical Engineering, Faculty of Mechanical, Maritime and Materials Engineering, Delft University of Technology, Delft, Netherlands
  • Marko Markovic - Applied Surgical and Rehabilitation Technology Lab, Dept. for Trauma Surgery, Orthopedics and Plastic Surgery, University Medical Center Göttingen, Göttingen, Germany

Deutscher Kongress für Orthopädie und Unfallchirurgie (DKOU 2018). Berlin, 23.-26.10.2018. Düsseldorf: German Medical Science GMS Publishing House; 2018. DocPT11-1003

doi: 10.3205/18dkou575, urn:nbn:de:0183-18dkou5751

Published: November 6, 2018

© 2018 Mouchoux et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. For license information see http://creativecommons.org/licenses/by/4.0/.



Text

Objectives: The loss of a hand is traumatizing, as it prevents the individual from interacting with their environment in a natural and simple way. Myoelectric prostheses are the state-of-the-art functional replacement for the lost limb. Unfortunately, their mechanical complexity and dexterity are not fully exploited due to inherent limitations of myoelectric control (low bandwidth, limited intuitiveness and robustness), which is also the main reason for high user dissatisfaction. We address this problem by advocating a more natural approach to controlling actuated (myoelectric) prostheses, based on emulating biological processes. Namely, we endow the prosthetic hand with artificial proprio- and exteroception, rendering it aware of the user's environment and intention so that it can ultimately react intelligently and in accordance with them. We achieve this by fusing inputs from multiple sensor modalities: myoelectric, inertial, visual and haptic.
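As a minimal sketch of this fusion idea, the snippet below scores candidate target objects by combining visual and inertial cues, gated by myoelectric activity. All names, weights and thresholds are illustrative assumptions, not the actual system:

```python
# Minimal sketch of multimodal intention scoring (illustrative only).
# Fields, weights and the EMG threshold are assumptions, not the real system.
from dataclasses import dataclass

@dataclass
class ObjectCandidate:
    name: str
    gaze_overlap: float      # 0..1, how centrally the object sits in the camera view
    reach_alignment: float   # 0..1, alignment of the (inertial) reach direction with the object

def score_intention(candidates, emg_activity):
    """Rank candidate objects by fused visual and inertial evidence,
    gated by myoelectric activity (no EMG drive -> no grasp intention)."""
    if emg_activity < 0.1:   # user is not commanding a grasp
        return None
    return max(candidates, key=lambda c: 0.6 * c.gaze_overlap + 0.4 * c.reach_alignment)

objects = [ObjectCandidate("mug", 0.8, 0.7), ObjectCandidate("pen", 0.3, 0.5)]
print(score_intention(objects, emg_activity=0.4))  # -> mug (score 0.76 vs 0.38)
```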

Methods: The system employs a color-depth camera mounted on the user's head and computer-vision algorithms to perceive the user's environment (exteroception) in real time. The system is therefore aware of all objects in the user's immediate vicinity. Likewise, infrared markers allow the prosthesis position to be tracked with respect to the user and their environment (proprioception). In this specific implementation, we use the system to preshape (i.e., adjust flexion, rotation, grip type and aperture) a dexterous prosthesis during the reaching phase of grasping tasks. In addition, we close the control loop between the user and the prosthesis by means of visual feedback rendered in augmented reality through holographic glasses worn by the user.
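To picture the preshaping step, the sketch below maps a detected object's shape class and width to a grip type and hand aperture. The grip names, thresholds and opening margin are hypothetical, not the rules of the actual controller:

```python
# Illustrative preshaping rule, assuming the vision system reports an object
# shape class and width in millimeters; all values here are assumptions.

def preshape(object_shape: str, object_width_mm: float):
    """Choose a grip type and hand aperture for the reaching phase."""
    if object_shape == "cylinder":
        grip = "power" if object_width_mm > 40 else "tripod"
    elif object_shape == "flat":
        grip = "lateral"
    else:
        grip = "precision"
    # Open slightly wider than the object, capped at the hand's maximum aperture.
    aperture_mm = min(object_width_mm + 20.0, 100.0)
    return grip, aperture_mm

print(preshape("cylinder", 65.0))  # -> ('power', 85.0)
```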

Results and conclusion: The developed system can infer the user's intention while reaching for an object in a complex scene and can assist the user by adapting the wrist orientation, grip type and aperture of the prosthesis to the properties and position of the targeted object (context intelligence). The visual feedback informs the user about the state of the system and the scene perceived by the artificial-exteroception system. Importantly, the system is not fully autonomous: the prosthesis movements are ultimately determined by mixing the commands sent by the autonomous controller with those acquired directly from the user through the myoelectric interface.
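This command mixing can be thought of as a weighted blend between the autonomous preshaping command and the direct myoelectric command, with stronger user input overriding the autonomous suggestion. The blending scheme below is an assumption for illustration only:

```python
# Sketch of shared control: a weighted blend of the autonomous command and
# the user's myoelectric command. The weighting is an illustrative assumption.

def blend_commands(auto_cmd: float, user_cmd: float, user_confidence: float) -> float:
    """Final actuator command: the user's myoelectric input overrides the
    autonomous controller in proportion to how strongly the user commands."""
    w = min(max(user_confidence, 0.0), 1.0)   # clamp weight to [0, 1]
    return w * user_cmd + (1.0 - w) * auto_cmd

# Example: strong user input dominates the autonomous suggestion.
print(blend_commands(auto_cmd=0.3, user_cmd=0.9, user_confidence=0.8))  # -> 0.78
```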

The context-aware prosthesis intelligently supports the user while grasping an object. We believe that this will reduce cognitive effort and increase prosthesis dexterity and overall control robustness. The system will be evaluated clinically with transradial amputees.

We acknowledge financial support by the German Federal Ministry of Education and Research (BMBF) under the project INOPRO.