
61st Annual Meeting of the German Society of Neurosurgery (DGNC) as part of Neurowoche 2010
Joint Meeting with the Brazilian Society of Neurosurgery on September 20, 2010

Deutsche Gesellschaft für Neurochirurgie (DGNC) e. V.

September 21-25, 2010, Mannheim

Gesture interaction with medical software for a neuronavigation device

Meeting Abstract

  • Till Kipshagen - Institut für Signalverarbeitung und Prozessrechentechnik, Universität zu Lübeck, Germany
  • Mathis Graw - Institut für Signalverarbeitung und Prozessrechentechnik, Universität zu Lübeck, Germany
  • Volker Tronnier - Klinik für Neurochirurgie, Universitätsklinikum Schleswig-Holstein, Campus Lübeck, Germany
  • Ulrich G. Hofmann - Institut für Signalverarbeitung und Prozessrechentechnik, Universität zu Lübeck, Germany
  • Matteo M. Bonsanto - Klinik für Neurochirurgie, Universitätsklinikum Schleswig-Holstein, Campus Lübeck, Germany

Deutsche Gesellschaft für Neurochirurgie. 61. Jahrestagung der Deutschen Gesellschaft für Neurochirurgie (DGNC) im Rahmen der Neurowoche 2010. Mannheim, 21.-25.09.2010. Düsseldorf: German Medical Science GMS Publishing House; 2010. DocP1820

doi: 10.3205/10dgnc291, urn:nbn:de:0183-10dgnc2915

Published: September 16, 2010

© 2010 Kipshagen et al.
This article is an Open Access article distributed under the terms of the Creative Commons license (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.de). It may be reproduced, distributed, and made publicly available, provided that the author and source are credited.



Objective: Neurosurgical procedures in the operating theatre are becoming increasingly dependent on the availability and use of information from several imaging modalities. Surgeon-machine interaction is still performed through direct contact with the navigation system, e.g. via a remote control or a touch-screen interface, which disrupts the surgeon's workflow. We propose a system that transforms real-time camera frames of hand gestures into interface commands controlling the OsiriX display, thus eliminating the need to physically touch any input device. This enables the neurosurgeon to control the software himself without compromising sterility requirements.

Methods: A stereo-camera setup consisting of two Unibrain Fire-I cameras is used to triangulate the position of the hand in 3D. An additional light source, preferably IR LEDs, may be used to brighten the scene. An existing prototype constructed from item system parts houses a 24″ iMac. Aluminum surfaces facilitate hygienic cleaning of the workstation. The camera frames are processed with a color-segmentation algorithm, and the centroid of the hand is computed. Shape classification uses the Fourier transform of the hand's contour and a nearest-neighbor method, allowing recognition of at least five different pre-defined gestures. The image-processing step is based on open-source software (OpenCV).
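To illustrate the front end of this pipeline, the following OpenCV sketch segments the hand by color, computes its image-plane centroid from image moments, and triangulates the centroids seen by the two cameras into a 3-D position. The HSV thresholds and the projection matrices are hypothetical placeholders, not the parameters of the actual system.

```python
import cv2
import numpy as np

def hand_centroid(frame_bgr, lower_hsv=(0, 40, 60), upper_hsv=(25, 255, 255)):
    """Segment the hand by color and return its image-plane centroid.

    The HSV thresholds are illustrative only; in practice they would be
    calibrated per glove color and lighting setup.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    # Remove small speckles before computing image moments.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None, mask  # no hand pixels found in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"]), mask

def triangulate(P_left, P_right, c_left, c_right):
    """Recover the 3-D hand position from the centroids seen by both
    cameras, given their calibrated 3x4 projection matrices."""
    pts_l = np.asarray(c_left, dtype=np.float64).reshape(2, 1)
    pts_r = np.asarray(c_right, dtype=np.float64).reshape(2, 1)
    x_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
    return (x_h[:3] / x_h[3]).ravel()  # dehomogenize -> (X, Y, Z)
```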

Results: Our system is capable of real-time processing and imposes no special requirements on the environment or the user. The detected hand position differs from the intended position by less than 2 cm in 96% of all cases and by less than 1 cm in over 50%. The correct gesture is recognized in 95% (±3%) of all cases, and handedness is not an issue. Intuitive gestures for commands such as “zoom”, “rotate”, “slide through image series”, “transpose” and “reset” are implemented.
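A minimal sketch of the shape-classification step described in the Methods: the hand contour is turned into a translation-, scale-, and rotation-invariant Fourier descriptor and matched by a nearest-neighbor rule against pre-recorded templates for the gestures listed above. The number of coefficients and the normalization are assumptions; the abstract does not specify them.

```python
import cv2
import numpy as np

def fourier_descriptor(mask, n_coeffs=16):
    """Fourier descriptor of the largest contour in a binary hand mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    boundary = max(contours, key=cv2.contourArea).squeeze(axis=1)
    # Boundary as a complex signal z(t) = x(t) + i*y(t).
    z = boundary[:, 0].astype(np.float64) + 1j * boundary[:, 1]
    spectrum = np.fft.fft(z)
    # Dropping the DC term removes translation; dividing by the first
    # harmonic's magnitude removes scale; taking magnitudes discards
    # rotation and the contour's start point.
    desc = np.abs(spectrum[1:n_coeffs + 1])
    return desc / desc[0]

def classify(desc, templates):
    """Nearest-neighbor match against pre-recorded gesture descriptors,
    e.g. {"zoom": ..., "rotate": ..., "slide": ..., "transpose": ...,
    "reset": ...} (keys chosen here for illustration)."""
    return min(templates, key=lambda g: np.linalg.norm(templates[g] - desc))
```

With this scheme, the template descriptors for the pre-defined gestures are recorded once, and per-frame classification reduces to a Euclidean distance over short descriptor vectors, which is cheap enough for real-time use.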

Conclusions: We present a system that enables surgeons to control the OsiriX medical visualization software by gestures. However, it is designed to be application-independent, and working versions exist for different platforms. The gesture recognition works reliably under most lighting conditions and is invariant to surgical glove color. The device can easily be extended to recognize new gestures. The system is currently being tested under surgical conditions in the operating theatre. In addition, it serves as the input device for a neuronavigation device under development.