Development of a Robot Assisted Ultrasound Based Radiation Therapy (USgRT)
PO-1607
Abstract
Authors: Seitz, Peter Karl(1)* [p.seitz@dkfz-heidelberg.de]; Schwahofer, Andrea(1); Baumann, Beatrice(2); Bendl, Rolf(2)
(1) German Cancer Research Center, Department of Medical Physics in Radiation Oncology, Heidelberg, Germany; (2) Heilbronn University, Medical Informatics, Heilbronn, Germany
Purpose or Objective
One of the central problems in radiotherapy is monitoring the tumor position during irradiation. Imaging methods are therefore sought that can detect the tumor position while the treatment is delivered. One such approach is ultrasound, which requires constant contact pressure on the patient surface. The aim of this work is to develop a robot application for respiratory and motion compensation with constant contact pressure, integrated into a medical image processing program for registering ultrasound images with planning data.
Material and Methods
The lightweight robot (KUKA LBR iiwa 7 R800) was integrated into the image processing program Medical Interaction Toolkit (MITK) [1] via a newly developed interface. The transducer position was monitored with an optical tracking system (Polaris Spectra). The experimental setup is shown in Fig. 1.
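As an illustration of the spatial chain behind this setup, the following Python sketch composes homogeneous transforms to map a pixel of the 2D ultrasound image into tracking-camera coordinates. All matrices, the probe calibration and the pixel spacing are hypothetical example values, not calibration results from this work.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Pose of the tracked marker (rigid body on the transducer) as reported by the
# optical tracking camera -- example values only.
camera_T_marker = make_transform(np.eye(3), [100.0, -50.0, 800.0])   # mm

# Probe calibration: fixed offset from the marker to the ultrasound image
# plane, determined once in a calibration procedure (hypothetical values).
marker_T_image = make_transform(np.eye(3), [0.0, 20.0, 60.0])        # mm

# A pixel in the 2D ultrasound image, scaled to mm and written as a
# homogeneous point (x, y, 0, 1) in the image plane.
pixel_spacing = 0.2                                                   # mm/px
point_image = np.array([150 * pixel_spacing, 300 * pixel_spacing, 0.0, 1.0])

# Chain the transforms: image -> marker -> camera coordinates.
camera_T_image = camera_T_marker @ marker_T_image
point_camera = camera_T_image @ point_image
print("Point in tracking-camera coordinates [mm]:", point_camera[:3])
```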
The main component is a compliant respiratory and motion compensation movement, programmed on the basis of the KUKA robot API. The desired contact force was set to 5 N. The movement was tested on subjects for different transducer positions and breathing patterns.
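The abstract does not detail the controller itself. As a hedged illustration of force-regulated surface following, the Python sketch below runs a minimal 1-D admittance loop against a simulated breathing surface; the contact model, gain and timing values are assumptions made for this sketch and do not reproduce the actual implementation on the KUKA robot API.

```python
import numpy as np

# --- Hypothetical 1-D contact model standing in for patient and force sensor ---
def surface_position(t):
    """Abdominal surface height [mm] moving with a breathing-like pattern."""
    return 10.0 * np.sin(2.0 * np.pi * t / 4.0)        # 4 s breathing period

def contact_force(probe_z, surf_z, stiffness=0.5):     # N per mm of indentation
    """Force on the transducer: zero without contact, spring-like in contact."""
    indentation = surf_z - probe_z                      # probe below surface -> contact
    return stiffness * indentation if indentation > 0 else 0.0

# --- Simple admittance controller: velocity proportional to force error -------
F_DESIRED = 5.0          # N, target contact force as stated in the abstract
GAIN = 4.0               # mm/s per N, controller gain (tuning value, assumed)
DT = 0.01                # s, control cycle (assumed)

probe_z = 15.0           # mm, probe starts above the surface
for step in range(int(10.0 / DT)):                      # simulate 10 s
    t = step * DT
    force = contact_force(probe_z, surface_position(t))
    error = F_DESIRED - force
    probe_z -= GAIN * error * DT                        # press down if force too low
    if step % 100 == 0:
        print(f"t={t:4.1f} s  force={force:4.1f} N  probe_z={probe_z:6.1f} mm")
```

The loop converges to the indentation depth that yields 5 N and then tracks the moving surface, which is the behavior described for the respiratory compensation.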
MITK was extended with filters for the spatial assignment of coordinates and the display of ultrasound images, e.g. for creating a 3D ultrasound volume from scanning movements of a 2D ultrasound transducer.
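To illustrate the compounding idea only (not the actual MITK filter pipeline), the following numpy sketch inserts tracked 2D frames into a 3D voxel grid by nearest-voxel assignment; frame content, poses and spacings are synthetic example values.

```python
import numpy as np

def insert_frame(volume, origin, spacing, frame, world_T_image, pixel_spacing):
    """Write one tracked 2D ultrasound frame into a 3D voxel volume
    (nearest-voxel insertion; no interpolation or hole filling)."""
    rows, cols = frame.shape
    # Pixel grid of the frame in image coordinates [mm], z = 0 on the image plane.
    u, v = np.meshgrid(np.arange(cols), np.arange(rows))
    pts = np.stack([u.ravel() * pixel_spacing,
                    v.ravel() * pixel_spacing,
                    np.zeros(u.size),
                    np.ones(u.size)])
    world = world_T_image @ pts                         # map pixels into world space
    idx = np.round((world[:3].T - origin) / spacing).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=1)
    volume[idx[inside, 0], idx[inside, 1], idx[inside, 2]] = frame.ravel()[inside]

# Example: sweep a synthetic 2D frame along the x-axis to build a small volume.
volume = np.zeros((100, 100, 100), dtype=np.float32)    # 1 mm isotropic voxels
origin, spacing = np.zeros(3), np.ones(3)
frame = np.random.rand(80, 80).astype(np.float32)        # stand-in ultrasound image
for i in range(50):
    world_T_image = np.eye(4)
    world_T_image[:3, :3] = np.array([[0., 0., 1.],
                                      [1., 0., 0.],
                                      [0., 1., 0.]])     # image plane spans the y/z axes
    world_T_image[0, 3] = i * 1.0                        # probe advanced 1 mm per frame
    insert_frame(volume, origin, spacing, frame, world_T_image, pixel_spacing=1.0)
print("Filled voxels:", int(np.count_nonzero(volume)))
```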
Results
With respiratory motion compensation active, image acquisition was continuous, and the transducer could be positioned and repositioned by remote control. The measured contact force at the transducer remained constant. The control concept and image representation are combined in a GUI, in which the developed filter pipeline allows the simultaneous display of registered ultrasound images and planning data in real time (Fig. 2).

Conclusion
A reliable, universal method for motion compensation under low contact forces in robot-guided ultrasound imaging was developed and successfully tested. This is a first step towards safe, autonomous robot positioning on the patient. In addition, the position-resolved image data could be integrated into one system, providing the basis for image registration in ultrasound-based radiotherapy.
[1] Nolden, M., et al.: The medical imaging interaction toolkit: challenges and advances. International Journal of Computer Assisted Radiology and Surgery 8(4), 607–620 (2013). DOI 10.1007/s11548-013-0840-8