gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

How high are the personnel costs for OSCE? A financial report on management aspects



  • corresponding author Thea Rau - Universitätsklinikum Ulm, Klinik für Kinder- und Jugendpsychiatrie/Psychotherapie, Ulm, Deutschland
  • author Jörg Fegert - Universitätsklinikum Ulm, Klinik für Kinder- und Jugendpsychiatrie/Psychotherapie, Ulm, Deutschland
  • author Hubert Liebhardt - Universitätsklinikum Ulm, Klinik für Kinder- und Jugendpsychiatrie/Psychotherapie, Ulm, Deutschland

GMS Z Med Ausbild 2011;28(1):Doc13

doi: 10.3205/zma000725, urn:nbn:de:0183-zma0007256

This is the translated version of the article.
The original version can be found at: http://www.egms.de/de/journals/zma/2011-28/zma000725.shtml

Received: July 16, 2010
Revised: October 22, 2010
Accepted: November 8, 2010
Published: February 4, 2011

© 2011 Rau et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.en). You are free: to Share – to copy, distribute and transmit the work, provided the original author and source are credited.


Abstract

Objective: The OSCE (objective structured clinical examination) was introduced in medical faculties in order to assess the clinical and practical skills of medical students. Implementing an OSCE requires a large number of staff; subjects with limited resources in particular therefore need efficient cost planning. In the winter semester 2009/10, the Clinic for Neurology at the Medical Faculty of the University of Ulm introduced the OSCE as a pilot project. A financial report retrospectively presents the personnel expenses. The report is intended as an exemplary insight into the resources needed for an OSCE with simulated patients.

Method: The calculation of the financial costs of the OSCE included the employment status of the staff involved, and the subject matter and temporal dimension of each task. After the exam, acceptance of the examination format was assessed in a focus group interview with the teachers and students.

Result: The total expenses for the personnel involved in the OSCE amounted to 12,468 €. The clinic’s share was calculated at 9,576 €; tuition fees from the students were used to the amount of 2,892 €. Converting the total expenditure to the number of examinees yields 86 € per student. Both students and teachers confirmed the validity of the OSCE and recognised the added value in terms of learning effects.

Conclusion: The high acceptance of the OSCE in neurology by both students and teachers favours maintaining the test format. Against the background of the high financial and logistical costs, however, it should be assessed in each individual case how an efficient examination procedure can be made possible in the long term.

Keywords: OSCE, clinical and practical examination, staff


Introduction

With the amendment to the German licensing regulations for doctors (ÄAppO) in 2002, an increase in the proportion of practical training, and at the same time a focus on practical examinations within medical studies, was fixed [1]. As required by these guidelines, teaching formats were adapted and an increasing number of practical components were integrated in many medical faculties. In line with this type of training, examination formats were adapted in a further step and the number of clinical-practical examinations increased [2]. In 2006 the demand for precise examination methods was taken up in Heidelberg by the examinations committee of the Society for Medical Education (Gesellschaft für Medizinische Ausbildung, GMA) and the Centre of Excellence for Assessment Baden-Württemberg, and the need to adapt examination formats to teaching formats was presented in a policy paper entitled “Guidelines for Internal Faculty Proofs of Performance during Medical Studies” [3]. On the basis of these generally accepted recommendations for the design of a high-standard examination procedure, a discussion was set in motion in the Medical Faculty of the University of Ulm as to whether and to what extent a reorganisation of examination procedures should take place. It was decided to declare 2008 the “examinations year” and, financed from tuition fees, to give the Office of the Dean of Studies an additional post for a research assistant for examination development, limited to 2 years [4]. It was further decided that the clinical and practical subjects should test the practical abilities of students after completing internships with an OSCE (objective structured clinical examination). A main factor in the choice of this examination format was that the OSCE is recognised as a standardised, valid and reliable method of examination and is already in use in many medical faculties [5], [6], [7].
Furthermore, it was possible to draw on experience with this examination format, as students had already been tested with an OSCE for many years in both emergency medicine and surgery at the Medical Faculty of the University of Ulm [8]. At the beginning of 2008, target agreements were sent by the Office of the Dean of Studies to the clinics and institutions, in which the dean of studies asked them to prepare the adaptation of the examination format to the appropriate teaching format. A similar commitment was linked to the allocation of performance-related bonuses (LOM) for 2008, so that there was an incentive to make changes as quickly as possible.

The reform of the examination system was critically discussed in many institutes and clinics. As has been reported several times [5], [9], [10], [11], [12], [13], subjects with limited resources for teaching in particular were anxious about the immense effort involved in terms of personnel, and consequently also finance, when carrying out an OSCE, and at first showed reluctance to implement the examination method. In the Clinic for Neurology of the Medical Faculty of the University of Ulm, after weighing the expense against the expected learning effect, it was decided to offer a practical examination for students. In the winter semester of 2009/2010, a pilot project took place with a group of 145 students. The project was evaluated from an economic point of view, with the emphasis on personnel controlling, and by both teachers and students with regard to the acceptability of the examination. Thus, on the one hand, a documented basis for optimising human resources for the OSCE in the neurological clinic is now available. On the other hand, the statement should establish a general idea of the personnel requirements for an OSCE with simulated patients, and thus provide a thorough decision-making aid for an economical examination format.


Method

The OSCE was planned and carried out in accordance with the “Guidelines for Internal Faculty Proofs of Performance during Medical Studies” [3]. Two training courses for the examiners took place before the OSCE, in which, on the one hand, the OSCE was introduced as an examination procedure and, on the other, a common level of expectation for the previously defined examination tasks was agreed upon. To this end, the lecturer from the Clinic for Neurology demonstrated the clinical skills on a volunteer in gradations ranging from the minimum performance level needed simply to pass the stage to the maximum performance level for very good results. The quality of performance was judged by the examiners, who each recommended a grade (ranging from 1 to 4). Correspondingly, an assessment (achieved/not achieved) was made using a checklist, and the sum of the points from the individual evaluation categories was compared with the grading. Finally, the assessments were put forward for discussion. Following the OSCE, the examiners had a further meeting at which the experiences with the individual examination tasks and the process of the exam were addressed. For the pre- and post-interview, four working hours (2 hours preparation/2 hours follow-up) were allotted.

The logistical tasks for the implementation of the exam were carried out by student assistants. In order to meet the demands of the pilot character and to avoid delays in the procedure, the registration point for the examination was staffed by twice the number of student assistants. The evaluation of the examination results was carried out using “Klaus”, evaluation software from Blubbsoft GmbH, Berlin (http://www.blubbsoft.de).

After completion of the exam, the effort of the people involved was recorded by operational area and content of each particular assignment, and complemented by the temporal dimension of the assignment. In this way it could be seen which tasks each person carried out within which timeframe. Performance was accounted for in full hours.

The statement on examination workload included all work stages, from preparation through implementation to evaluation of the OSCE.

For the calculation of personnel costs in euros, an average was estimated (taking pay scales into consideration) from the salary code in line with each particular tariff agreement (see table 1 [Tab. 1]), as per employment status. In a further step, the average employer costs per year for the individual positions were divided by the annual working time. The working time required for an activity was then multiplied by the respective gross hourly rate. For those involved who were paid according to basic fee rates, the hourly rate was multiplied directly by the working time. The calculation was made exclusively on the basis of time-related employer costs. Performance-related pay, where present, was excluded, as were additional personnel costs and further expenses such as overtime pay and allowances.
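The two-step calculation described above (average annual employer cost divided by the annual working time, then multiplied by the hours worked) can be sketched in a few lines. The annual working time and the salary figure below are hypothetical placeholders, not the actual tariff values from table 1 [Tab. 1].

```python
# Sketch of the personnel-cost calculation described above.
# All figures are hypothetical placeholders, not the tariff values from Tab. 1.

ANNUAL_WORKING_HOURS = 1_650  # assumed annual working time per full post

def gross_hourly_rate(annual_employer_cost: float) -> float:
    """Average employer cost per year divided by the annual working time."""
    return annual_employer_cost / ANNUAL_WORKING_HOURS

def task_cost(hours_worked: float, hourly_rate: float) -> float:
    """Working time required for an activity times the gross hourly rate."""
    return hours_worked * hourly_rate

# Example: an examiner with a hypothetical employer cost of 82,500 euros/year
rate = gross_hourly_rate(82_500)   # 50.0 euros/hour
print(task_cost(6.0, rate))        # 6 hours of examining -> 300.0 euros
```

For persons paid according to basic fee rates (e.g. simulated patients), `task_cost` is applied directly to the agreed hourly fee.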

The statement on personnel costs for conducting the OSCE (exam development and monitoring) in the Office of the Dean of Studies and in the Clinic for Neurology was divided into three planning phases (preparation, implementation and evaluation). Both the research assistant for examination development in the Office of the Dean of Studies and the research assistant for the simulated patient programme, together with the workload involved, were financed by tuition fees.

The deployment of student assistants, simulated patients and examiners was accounted for as an overall budget. The attendance of training courses, the time taken to conduct the examinations and the follow-up work after completion of the examination course were included. The examiners’ salary codes are independent of the training and function of the persons within their regular work (e.g. a position in the clinic). The calculation for the examination was made on the basis of these guidelines and the deployment plan of the Clinic for Neurology. To simplify matters, teaching staff with teaching commitments brought in from outside institutions for the examinations were remunerated according to the tariff agreement for doctors at university clinics (TA doctors), university clinics of the federal states, tariff area west; an average within the tariff for senior physicians (Ä3) was applied. The logistical effort of the Clinic for Neurology’s administration was also recorded as a flat sum. Included here were the effort involved in preparing the examination and the evaluation of the exam results using the software programme “Klaus”.

The emphasis in this investigation was upon the calculation of personnel costs [14]. All necessary material costs1 were only taken into consideration if they were essential for the procurement of material relevant to personnel costs. This cost factor was visible in the statement on the content of work performance.

When assessing working time, the “pilot” character of the project had to be taken into consideration, which justified an increase in personnel for the logistical tasks, and included the development of the examination.

Description of the Examination Setting

The OSCE in neurology took place as part of the medical degree course in the 8th semester. Students went through four examination stages (see Figure 1 [Fig. 1]), each of which was occupied by one simulated patient. The tasks comprised four different neurological examinations including the findings; the focus was on the examination of the patients. The exam tasks were chosen from the practical learning objectives of neurology. In preparation for the exam, students were provided with a handout on the Ulm medical learning platform. Videos with demonstrations of neurological examinations on patients were also available. Furthermore, students were given practical training as part of the teaching in the Ulm Clinic for Neurology and in cooperating institutions, with great emphasis already placed on consistent teaching. As both students and examiners were dealing with an OSCE in neurology for the first time, while an adaptation of both teaching and learning behaviour was occurring simultaneously, extensive preparation for the examination had to be provided. The Clinic for Neurology therefore decided first to test four selected examination tasks in order to gain initial experience with the examination format within a manageable framework. At the time of the examination, the students had no information on the content of their tasks, so it can be assumed that they prepared themselves for all practical learning contents.

All examination stages were staffed by qualified examiners (specialist physicians, senior physicians, practising neurologists). In four parallel courses (see figure 2 [Fig. 2]), a total of 145 examinees were tested on the examination date. The length of the exam was 20 minutes per examinee in total: four minutes for each stage plus one minute of changeover time.

The following staff from the respective institutions were involved in the examination process: Office of the Dean of Studies (2 research assistants for examination development/the simulated patient programme), Clinic for Neurology (2 lecturers, administration, 15 doctors as examiners), five external examiners, 19 simulated patients and 19 student assistants.

Almost all the logistical tasks necessary during the process of the OSCE (registration, timing, etc.) were completed by student assistants.

Focus Group Interview

Directly following the OSCE, 40 students, ten per course (28 percent), were questioned about the exam in the form of a short interview (see figure 3 [Fig. 3]). Questioning took place at defined regular intervals, in parallel in the courses, and on an individual basis only. Furthermore, following the OSCE all teachers/examiners involved were questioned on the examination situation. Questioning was carried out by the Office of the Dean of Studies and the Clinic for Neurology. Both groups were questioned about the acceptability of the examination, the learning effect, and learner and teacher behaviour. For the interview situation, key questions of a discursive and dialogue-based nature were developed [15], [16].


Results

A detailed statement was prepared on the personnel expenses involved in the development, implementation and evaluation of the OSCE, divided according to various aspects2. The overall expenses for personnel were 12,468 €. The clinic’s share was calculated to be 9,576 €. Tuition fees to the amount of 2,892 € were used for the fees for simulated patients and for the two research assistants (examination development and simulated patient programme) (see table 2 [Tab. 2]). Converting the overall costs to the 145 examinees yields approximately 86 € per examinee.

If we divide the OSCE into three planning phases, then the following amounts were required: for planning 4,677 €; for implementation 5,625 €; for post-processing 2,166 € (see table 2 [Tab. 2]).
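As a plausibility check, the three phase amounts add up to the reported total, and converting that total to the 145 examinees reproduces the figure of roughly 86 € per student:

```python
# Cross-check of the reported figures (all amounts in euros).
phases = {"preparation": 4_677, "implementation": 5_625, "evaluation": 2_166}

total = sum(phases.values())
print(total)                 # 12468, the reported overall personnel expenses

per_examinee = total / 145   # 145 examinees on the examination date
print(round(per_examinee))   # approximately 86 euros per examinee, as stated
```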

For the use of simulated patients in the examination, including training (training in two groups), the sum of 1,485 € was calculated (see table 3 [Tab. 3]). The sum of 6,541 € was calculated for the use of examiners in the OSCE (see table 4 [Tab. 4]). With two hours each of preparation and follow-up, at a cost of 3,290 €, this component was about as large as that involved in implementing the examination itself.

The amount for supervision and monitoring totalled 3,340 €. 30 working hours were taken up with exam preparation in the clinic, almost as much time as was required for the development of the exam (see table 5 [Tab. 5]). The deployment of student assistants incurred costs for training and personnel deployment of 946 €, which were borne by the clinic (see table 6 [Tab. 6]). For the administration and evaluation of the examination using the programme “Klaus”, and for other material acquisitions in the area of responsibility of the clinic administration, 156 € were calculated for eight working hours (see table 6 [Tab. 6]).

Focus Group Interview

All students questioned experienced the exam situation as pleasant and emphasised the added benefit of the examination thanks to the practical exercises beforehand. It was further stated that the examinations were prepared for in learning groups and that the practical methods of investigation were practised on learning partners. In addition, practical knowledge was consolidated with theoretical knowledge from textbooks. Students judged the examination overall to be fair and comprehensibly structured. The time allowed for the tasks in the separate stages was described as sufficient. The intensive preparation for the examination by the teachers (video with neurological methods of investigation, practically oriented handout, practical instructions and tutorials) was highly appreciated. All students questioned were in favour of keeping the examination format.

The interview with 16 examiners produced a similarly positive overall picture. In particular, the structured form of the examination, with the provision of checklists, facilitated a fair and transparent assessment of examinees. The timeframe for the tasks and the evaluation of the examination was judged appropriate. A need was recognised for stronger standardisation of the neurological methods of investigation in order to offer more uniform practices at the various training locations in future. The average exam performance of students was 7.64 points out of a possible 8. Students thus exhibited above-average performance.


Discussion

The results of the evaluation show that the costs for an OSCE are high, with a total expenditure of 12,468 €. The question for discussion is in which areas the number of hours could be reduced, or indeed cut entirely, following the experiences gained from the pilot project, and for which tasks less qualified personnel could be used. If another OSCE is implemented with the same people, considerably fewer training sessions for student assistants and examiners will be necessary in the preparation phase, and the preparation costs will therefore fall. In order to reduce costs further, it would also be possible, with a detailed description of tasks, to entrust the organisation of the course to the student assistants and thus reduce the high expenses for qualified personnel. Furthermore, fewer personnel are required for the logistical work during the implementation of the examinations; for example, only two student assistants per course are necessary for exam registration and the timing of the stage changes. The proportion of costs for the review procedure on the exam content should, however, be retained, and for the optimisation of interrater reliability should in fact even be raised. Establishing consistent evaluation schemes through a review by external examiners requires sufficient development time, and it has been shown that sufficient time must be allowed for the establishment of consistent evaluation standards as an essential quality characteristic of examinations [3]. Differences in evaluation procedures which exist owing to the distance between the training centres and individual methods of investigation can be reduced in this way, thus increasing the uniformity and also the legal certainty of performance evaluation.

In order to secure a common horizon of expectation for the evaluation of students’ performance, a direct demonstration of the task and exemplary evaluation are again advisable immediately before the exam.

It should also be considered whether the use of simulated patients, with a budget of 1,485 €, is appropriate to the content of the tasks. Stages with no communicative elements can also be staffed by persons with less training; it should be considered whether student assistants might be used as simulated patients for the testing of neurological investigations. Insofar as communicative elements are to be integrated into the OSCE, considerably more time must be provided for the training of simulated patients.

If only the costs for the development of the exam and monitoring in the preparatory phase are taken into account, then running several parallel exam courses proves worthwhile: the time involved in the purely logistical effort of training the examiners is reduced. However, it must then be guaranteed that enough examiners (in this case 16 doctors) are available for the examination. As well as clinic personnel, it is consequently advisable to schedule external personnel, for example external habilitated persons with teaching duties.

In order to achieve further savings in personnel costs for logistical tasks, it should also be considered whether different subjects could be merged into one course. In this way the work involved could be reduced not just for one subject, and all preparatory work could be spread among several contact persons. It should nevertheless be remembered that the examination time per examinee increases with the number of stages, and good coordination between the subjects is needed so that the pressure of the examination is not set too high. For smooth exam procedure planning, meetings between contact persons must also be incorporated. Insofar as the subject areas do not take on the coordination themselves, personnel must be scheduled for the associated tasks, which must be additionally calculated. Here it is necessary to plan ahead and, for example, to set up central service departments for a simulated patient programme and exam development on an ongoing basis.

A positive point in the cost report is the low amount of personnel costs for the evaluation of the exam. The use of the evaluation tool “Klaus” proved successful. Thanks to the good training of the examiners it was possible to evaluate the answer sheets automatically, almost without any need for adjustments, and then to transfer responsibility for the calculation of grades to the clinic administration. Not least, the intended reliability of the exam must be considered when calculating an OSCE. For a good exam, 10 to 14 OSCE stages are recommended [17]. As a result of the increase in the number of examination tasks from such an extension of the examination, and the time required for the pre- and post-review, more personnel are required per stage. The implementation of the OSCE alone would then cost approximately 5,233.40 € more, with a longer period of deployment for those involved (8 hours)3.

The times for information events on the logistical process of the OSCE, on the other hand, generally remain the same. For the electronic evaluation of the exams, only an insignificant difference of about 234 € arises, depending on the number of exam papers.

When planning the examination, the calculated quantities and the quality demands of an exam must therefore be carefully weighed, and potentially conflicting interests reconciled.

Results have shown that students rapidly adjust to the required course contents, so the examination has a strong “assessment drives learning” effect. The evaluation of the exam confirms overall that the examination method is especially suitable for achieving valid examination results, and it also has a positive effect on teaching, as increased coordination among the teachers on course content led to a recognisable consensus.

The high degree of acceptance of the examination format by those students asked and the above average exam results speak in favour of maintaining the examination format.

Calculation of Personnel Costs for an OSCE

The results show the personnel expenditure for an OSCE as planned and implemented at the Ulm Clinic for Neurology. On the basis of the data presented in this paper, a transfer to one’s own exam situation can be made according to a predetermined pattern. For the preparation of a calculation, basic data on the planned examination have to be collected. These basic data allow for a concrete calculation of personnel costs, subject to the restriction that average values are used (see figure 4 [Fig. 4]).
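Under the restriction of average values mentioned above, the pattern behind such a calculation can be sketched as a simple aggregation over staff roles. All role names, headcounts, hours and hourly rates below are hypothetical placeholders to be replaced with one’s own basic data.

```python
# Hypothetical sketch of the transfer pattern: collect basic data per staff
# role and aggregate the personnel costs. Replace the placeholder figures
# with the basic data of your own planned examination.
from dataclasses import dataclass

@dataclass
class Role:
    name: str
    headcount: int
    hours_each: float   # hours per person across all planning phases
    hourly_rate: float  # average gross hourly rate in euros, from the tariff table

    def cost(self) -> float:
        return self.headcount * self.hours_each * self.hourly_rate

roles = [
    Role("examiners", 16, 8.0, 50.0),
    Role("student assistants", 19, 6.0, 10.0),
    Role("simulated patients", 19, 5.0, 12.0),
]

total = sum(r.cost() for r in roles)
print(f"{total:.2f} euros")  # aggregated personnel costs for these placeholder data
```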


Conclusion

With the retrospective quantitative cost report for personnel, efficient deployment planning for a further OSCE in the Ulm Clinic for Neurology is now possible, in which individual areas can be reduced within the overall volume. With the detailed documentation of the work steps in the individual planning phases, it is possible to decide whether the work carried out in planning the exam was necessary. With an exact distribution of tasks when conducting a new OSCE, a precise calculation adjusted to the monetary framework and individual needs can be achieved. With the help of this cost report, subject areas planning a pilot project are better able to estimate the financial framework of an examination with simulated patients. Using the calculation procedure (see figure 4 [Fig. 4]) and the tariff table (see table 1 [Tab. 1]), a calculation adjusted to one’s own needs can be made. The high approval rating of the examination and the positive exam results also create the motivation for a continued interest in this resource-intensive examination format.


Notes

1 At the Medical Faculty of the University of Ulm, the costs for the deployment of student assistants and simulated patients are budgeted as “material costs” or “administrative expenses”. For the purposes of this report, both items were entered as personnel costs.

2 Note: When using the data it must be remembered that they were calculated largely on the basis of mean values. The actual costs may therefore lie well below or above the specified values, depending on the actual employment status (e.g. pay scale) of the person. Only the deployment of student assistants and simulated patients could be precisely accounted for and can be used as a secure basis for further use. All other data serve, under the restrictions referred to, as guidelines.

3 The higher staffing requirement was projected on the basis of the four-stage OSCE. 20 examiners and 19 student assistants were assumed in the calculation. After each round of examinations, a break of ten minutes was provided.


Acknowledgements

With special thanks to Katrin Rudolf from the Personnel Department of the University Clinic, Ulm, for her help with calculation of the personnel costs.


Competing interests

The authors declare that they have no competing interests.


References

1.
Bundesministerium für Gesundheit. Approbationsordnung für Ärzte vom 27. Juni 2002 (BGBI. I S. 2405). Zuletzt geändert durch Artikel 71 des Gesetzes vom 21. Juni 2005 (BGBI. S. 1818). Berlin: Bundesgesetzblatt; 2005. S.1818.
2.
Kruppa E, Jünger J, Nikendei C. Einsatz innovativer Lern- und Prüfungsmethoden an den Medizinischen Fakultäten der Bundesrepublik Deutschland. Dtsch Med Wochenschr. 2009;134:371-372. DOI: 10.1055/s-0028-1124008
3.
Gesellschaft für Medizinische Ausbildung, GMA-Ausschuss Prüfungen & Kompetenzzentrum Prüfungen Baden-Württemberg, Fischer MR (corresponding author). Leitlinie für Fakultäts-interne Leistungsnachweise während des Medizinstudiums: Ein Positionspapier des GMA-Ausschusses Prüfungen und des Kompetenzzentrums Prüfungen Baden-Württemberg. GMS Z Med Ausbild. 2008;25(1):Doc74. Zugänglich unter/available under: http://www.egms.de/static/de/journals/zma/2008-25/zma000558.shtml
4.
Universität Ulm. Verwendung der Studiengebühren der Medizinischen Fakultät 2007 bis 2009. Ulm: Universität Ulm; 2007.
5.
Chenot JF, Ehrhardt M. Objective structured clinical examination (OSCE) in der medizinischen Ausbildung: Eine Alternative zur Klausur. Z Allg Med. 2003;79(9):437-442. DOI: 10.1055/s-2003-43064
6.
Schrauth M, Riessen R, Schid-Degenhard T, Wirtz HP, Jünger J, Häring HU, Claussen CD, Zipfel S. Praktische Prüfungen sind machbar. GMS Z Med Ausbild. 2005;22(2):Doc20. Zugänglich unter/available under: http://www.egms.de/static/de/journals/zma/2005-22/zma000020.shtml
7.
Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Brit Med J. 1975;1:447-451.
8.
Weißer FO, Dirks B, Georgieff M. Objective Structured Clinical Examination (OSCE). Notfall Rettungsmed. 2004;7:237-243.
9.
Cusimano MD, Cohen R, Tucker W, Murnaghan J, Kodama R, Reznick R. A comparative analysis of the costs of administration of an OSCE (objective structured clinical examination). Acad Med. 1994;69(7):571-576. DOI: 10.1097/00001888-199407000-00014
10.
Hodges B, Hanson M, McNaughton N, Regehr G. Creating, monitoring and improving a psychiatry OSCE. Acad Psych. 2002;26(3):134-161. DOI: 10.1176/appi.ap.26.3.134
11.
Kelly M, Murphy A. An evaluation of the cost of designing and assessing an undergraduate communication skills module. Med Teach. 2004;26(7):610-614. DOI: 10.1080/01421590400005475
12.
Rotthoff T, Willers R, Siebler M, Lindner S, Scherbaum W, Soboll S. Partielle OSCE Prüfung zur Einsparung von Ressourcen. GMS Z Med Ausbild. 2008;25(1):Doc09. Zugänglich unter/available under: http://www.egms.de/static/de/journals/zma/2008-25/zma000493.shtml
13.
Reznick RK, Smee S, Baumber JS, Cohen R, Rothman A, Blackmore D, Bérard M. Guidelines for estimating the real cost of an objective structured clinical examination. Acad Med. 1993;68(7):513-517. DOI: 10.1097/00001888-199307000-00001
14.
Jansen, Th. Kompakt-Training Personalcontrolling. Ludwigshafen: Praktische Betriebswirtschaft; 2008.
15.
Meuser M, Nagel U. ExpertInneninterviews – vielfach erprobt, wenig bedacht. Ein Beitrag zur qualitativen Methodendiskussion. In: Bogner A, Littig B, Menz W (Hrsg). Das Experteninterview. Theorie, Methode, Anwendung. Opladen: Westdeutscher Verlag; 2002. S.71-93.
16.
Bogner A, Littig B, Menz W. Experteninterviews. Theorien, Methoden, Anwendungsfelder. Wiesbaden: Verlag für Sozialwissenschaften; 2009.
17.
Nikendei C, Jünger J. OSCE - praktische Tipps zur Implementierung einer klinisch-praktischen Prüfung. GMS Z Med Ausbild. 2006;23(3):Doc47. Zugänglich unter/available under: http://www.egms.de/static/de/journals/zma/2006-23/zma000266.shtml