gms | German Medical Science

56. Jahrestagung der Deutschen Gesellschaft für Neurochirurgie e. V. (DGNC)
3èmes journées françaises de Neurochirurgie (SFNC)

Deutsche Gesellschaft für Neurochirurgie e. V.
Société Française de Neurochirurgie

May 7 to 11, 2005, Strasbourg

Novel user interaction paradigm for medical augmented reality

Neues Interaktionsparadigma für Augmented Reality in der Medizin

Meeting Abstract

  • corresponding author D. Freudenstein - Klinik für Neurochirurgie, Eberhard-Karls-Universität, Tübingen
  • J. Fischer - Forschungsgruppe Visuelle, Computergestützte Medizin, Eberhard-Karls-Universität, Tübingen
  • D. Bartz - Klinik für Neurochirurgie, Eberhard-Karls-Universität, Tübingen
  • M. Neff - BrainLAB AG, Heimstetten
  • M. Tatagiba - Klinik für Neurochirurgie, Eberhard-Karls-Universität, Tübingen
  • F. Duffner - Klinik für Neurochirurgie, Eberhard-Karls-Universität, Tübingen

Deutsche Gesellschaft für Neurochirurgie. Société Française de Neurochirurgie. 56. Jahrestagung der Deutschen Gesellschaft für Neurochirurgie e.V. (DGNC), 3èmes journées françaises de Neurochirurgie (SFNC). Strasbourg, 07.-11.05.2005. Düsseldorf, Köln: German Medical Science; 2005. Doc11.05.-16.04

The electronic version of this article is the complete one and is available online at: http://www.egms.de/de/meetings/dgnc2005/05dgnc0259.shtml

Published: May 4, 2005

© 2005 Freudenstein et al.
This is an Open Access article distributed under the terms of the Creative Commons license (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.de). It may be copied, distributed, and made publicly available, provided that the author and source are credited.


Outline

Text

Objective

The graphical overlay of additional medical information over the patient during a surgical procedure has long been considered one of the most promising applications of augmented reality (AR). While many experimental systems for AR in medicine have reached an advanced stage, they usually depend on specialized hardware. Such dedicated systems are often not suited for use in daily clinical practice. The authors present a new user interaction paradigm for creating operation plan sketches directly on the patient, based on a novel medical AR application (ARGUS).

Methods

We have developed a technique for providing comprehensive user interaction based on information delivered by an image-guided surgery (IGS) system (VectorVision; BrainLAB, Heimstetten, Germany). A standard tracking tool is used for triggering actions by indicating menu markers located at previously defined positions and for the definition of points or more complex shapes in a three-dimensional (3D) environment. The ARGUS software continually downloads the position and orientation of the tracked tools from the IGS device. It then looks up the user-defined interaction tool, which is identified by a unique name string. Consecutive tool positions and orientations are compared in order to detect basic gestures and the triggering of menu actions.
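
The following is a minimal, illustrative sketch (in Python) of such a polling and gesture-detection loop; it is not the ARGUS source code. The function get_tracked_tools() stands in for the proprietary VectorVision interface, and the tool name, dwell radius, and timing constants are assumptions chosen purely for illustration.

    import math
    import time

    INTERACTION_TOOL = "pointer_1"   # hypothetical unique name string of the interaction tool
    DWELL_RADIUS_MM = 3.0            # tool tip counts as stationary within this radius
    DWELL_TIME_S = 1.0               # holding still this long is interpreted as a "click"

    def distance(p, q):
        """Euclidean distance between two 3D points given as (x, y, z) tuples."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    def gesture_loop(get_tracked_tools, on_click, on_draw):
        """Compare consecutive tool positions to detect dwell ("click") and drawing gestures."""
        anchor_pos, anchor_time = None, None
        while True:
            tools = get_tracked_tools()             # hypothetical: {name: (position, orientation)}
            if INTERACTION_TOOL not in tools:
                anchor_pos = None                   # tool left the tracking volume
                time.sleep(0.02)
                continue
            pos, _orientation = tools[INTERACTION_TOOL]
            now = time.monotonic()
            if anchor_pos is None or distance(pos, anchor_pos) > DWELL_RADIUS_MM:
                anchor_pos, anchor_time = pos, now  # tool is moving: record a freehand sample
                on_draw(pos)
            elif now - anchor_time >= DWELL_TIME_S:
                on_click(pos)                       # tool held still: trigger a "click" gesture
                anchor_pos, anchor_time = pos, now
            time.sleep(0.02)                        # poll the IGS stream at roughly 50 Hz

In a setup like this, the sampled points share the coordinate system of the navigation data, which is what allows freehand sketches to be overlaid on the registered patient anatomy.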

Results

Our method enables the user to define points and freely drawn shapes in 3D. Additionally, it provides selectable menu items, which can be located in the immediate proximity of the patient. This eliminates the need for conventional touchscreen- or mouse-based user interaction without requiring dedicated hardware. Thus, the surgeon can interact directly with the system.
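
As an illustration of how such menu items might be resolved, the sketch below continues the Python example above (reusing its distance() helper) and tests whether a "click" position falls inside a sphere around one of the predefined menu marker positions. The item names and coordinates are hypothetical examples, not the actual ARGUS menu layout.

    MENU_ITEMS = {
        "load_dataset":  (120.0, 40.0, 15.0),    # hypothetical marker positions in mm,
        "start_drawing": (120.0, 80.0, 15.0),    # expressed in tracker coordinates
        "clear_sketch":  (120.0, 120.0, 15.0),
    }
    MENU_HIT_RADIUS_MM = 10.0

    def resolve_menu_action(click_pos):
        """Return the menu item whose marker region contains the click position, if any."""
        for name, marker_pos in MENU_ITEMS.items():
            if distance(click_pos, marker_pos) <= MENU_HIT_RADIUS_MM:
                return name
        return None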

Conclusions

The interaction modes supported by the presented method can be used for a wide range of applications. The menu system could, for instance, support changing the parameters of an advanced information display. Even conventional functions such as loading the patient dataset or initiating the patient registration procedure could be triggered using our novel menu system.