The ACTIVE project develops an integrated multi-robotic platform for neurosurgery. A light and agile robotic cell and an advanced processing unit for pre- and intra-operative control will operate both autonomously and cooperatively with surgical staff. The project will advance research on robotic behavior customization and provide scientific and technological impact in the field of adaptive robotic control. The quality and reliability of surgical robots will be improved, which will lead to an increase in the market for professional service robots in Europe and world-wide.
|Duration:||01.04.2011 to 30.03.2015|
|Grant number:||European Commission FP7-ICT-2009-6|
|Partner:||POLIMI (IT) (Coordinator), CNR-ITIA (IT), IMPERIAL (UK), KIT (DE), IIT (IT), TECHNION (IL), TUM (DE), DFKI (DE), DDEP (IT), TASMC (IL), Force Dimension (CH), Renishaw (IRE), Medimaton (UK), CFc (IT), KUKA (DE)|
|Application Field:||Assistance- and Rehabilitation Systems|
The ACTIVE project exploits ICT and other engineering methods and technologies for the design and development of an integrated multi-robotic platform for neurosurgery. A light and agile robotic cell with 20 degrees-of-freedom (DoFs) and an advanced processing unit for pre- and intra-operative control will operate both autonomously and cooperatively with surgical staff on the brain, a loosely structured environment. As the patient will not be considered rigidly fixed to the operating table and/or to the robot, the system will push the state of the art in robotics and control to achieve the accuracy and bandwidth required by this challenging and complex surgical scenario.
The ACTIVE project outcome will be a novel robotic suite for awake patient surgery. A number of cooperating robots will interact with the brain, which deforms due to tool contact, blood pressure, breathing and deliquoration.
Human factors are addressed by enabling easy interaction with the users through a novel haptic interface for tele-manipulation and through a collaborative control mode. Active constraints will limit and direct tool-tip position, force and speed, preventing damage to eloquent areas defined on realistic tissue models that are updated in the field through sensor information. The active constraints will be updated (displaced) in real time in response to feedback from tool-tissue interactions and to any additional constraints arising from a complex shared workspace.
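The idea of an active constraint that limits tool-tip speed near an eloquent area can be sketched as a simple virtual fixture. The following is an illustrative sketch only, not the ACTIVE project's actual controller; the spherical forbidden region, the `safe_radius` and `soft_margin` parameters, and the linear damping law are all assumptions chosen for clarity.

```python
import numpy as np

def constrain_velocity(tip_pos, cmd_vel, region_center, safe_radius, soft_margin):
    """Damp the commanded velocity component directed toward a forbidden region.

    tip_pos, cmd_vel, region_center : 3-vectors (numpy arrays)
    safe_radius : distance at which motion toward the region is fully blocked
    soft_margin : width of the zone in which the velocity is progressively damped
    """
    to_region = region_center - tip_pos
    dist = np.linalg.norm(to_region)
    if dist == 0.0:
        return np.zeros(3)                    # degenerate case: stop entirely
    direction = to_region / dist
    toward = np.dot(cmd_vel, direction)       # speed component toward the region
    if toward <= 0 or dist >= safe_radius + soft_margin:
        return cmd_vel                        # moving away, or far enough: unmodified
    # Damping factor: 1 at the outer edge of the margin, 0 at the safety boundary.
    scale = max(0.0, (dist - safe_radius) / soft_margin)
    return cmd_vel - (1.0 - scale) * toward * direction
```

For example, a tool tip 20 mm from the region center with `safe_radius=0.01` and `soft_margin=0.02` has its approach speed halved, and at the 10 mm boundary the approach component is removed entirely while tangential motion stays free.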
The operating room is the epitome of a dynamic, unstructured and volatile environment, crowded with people and instruments. The workspace will be monitored by environmental cameras, and machine learning techniques will be used for safe workspace sharing. Decisions about collision avoidance and downgrading to a safe state will be taken autonomously; the movement of the patient's head will be filtered by a bespoke active head frame, while fast and unpredictable patient motion will be compensated by a real-time cooperative control system.
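The autonomous downgrading to a safe state described above can be pictured as a small mode-switching state machine. This is a hypothetical sketch, not the ACTIVE implementation; the three input flags and the priority ordering are assumptions, standing in for whatever signals the environmental cameras and motion sensors actually provide.

```python
from enum import Enum

class Mode(Enum):
    AUTONOMOUS = 0    # robot executes the surgical plan on its own
    COOPERATIVE = 1   # hands-on / tele-manipulated operation with the surgeon
    SAFE_STOP = 2     # motion halted, system downgraded to a safe state

def next_mode(mode, human_in_workspace, collision_predicted, patient_motion_fast):
    """Select the next control mode from workspace-monitoring flags.

    Downgrades take priority over upgrades: any predicted collision or fast
    patient motion forces SAFE_STOP regardless of the current mode.
    """
    if collision_predicted or patient_motion_fast:
        return Mode.SAFE_STOP                  # imminent hazard: halt immediately
    if human_in_workspace and mode is Mode.AUTONOMOUS:
        return Mode.COOPERATIVE                # share the workspace safely
    if not human_in_workspace and mode is Mode.SAFE_STOP:
        return Mode.AUTONOMOUS                 # hazard cleared: resume the plan
    return mode                                # otherwise keep the current mode
```

The design choice worth noting is that the hazard checks come first, so no combination of inputs can upgrade the mode while a hazard flag is raised.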
Cognitive skills will help to identify the target location in the brain and to constrain robotic motions by means of on-field observations. Cognitive research within ACTIVE, aimed at robotic behavior customization, will provide scientific and technological impact in the field of adaptive robotic control by improving the system's control strategy through experience and on-field learning.
The quality of surgical robots will be improved, since the surgeon will be able to perform brain surgery (for epilepsy in the primary scenario, with possible follow-up in DBS and tumor removal) on an awake patient. This is crucial for better identifying the eloquent areas to be preserved during tissue removal.
The potential number of patients for epilepsy surgery is approximately 10,000 to 24,000 per year in Europe. The ACTIVE project aims to strengthen the professional service robotics market in Europe, allowing its possible exploiter RENISHAW to position its neurological department in the world market. FORCE DIMENSION and KUKA will act as providers of robotic components for OEMs that integrate these components into their medical systems.
During the project, DFKI and its partners developed and tested several methods. The first is the analysis of the situation in the operating room for workflow analysis and, in particular, for automatically switching the robot's control modes. The second is the analysis of EEG data in the pre-surgical step to localize epilepsy-relevant areas in the brain. For the analysis and comparison of the different methods, we used pySPACE, the machine learning and data analysis framework developed at DFKI.
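The kind of EEG processing described above is typically organized as a chain of processing nodes. The following is a minimal plain-numpy sketch loosely mimicking that node-chain idea; it is NOT pySPACE's actual API, and the sampling rate, frequency band and node names are illustrative assumptions.

```python
import numpy as np

def detrend(x):
    """Remove the per-channel DC offset (mean over time)."""
    return x - x.mean(axis=-1, keepdims=True)

def band_power(x, fs=256.0, band=(3.0, 30.0)):
    """Average spectral power per channel inside a frequency band.

    x is an array of shape (channels, samples); fs is the sampling rate in Hz.
    """
    freqs = np.fft.rfftfreq(x.shape[-1], d=1.0 / fs)
    spec = np.abs(np.fft.rfft(x, axis=-1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spec[..., mask].mean(axis=-1)

def run_chain(nodes, x):
    """Apply each processing node to the data in order, like a node chain."""
    for node in nodes:
        x = node(x)
    return x

# A two-node chain: detrend the signal, then extract band-power features
# that a downstream classifier could use to flag conspicuous channels.
chain = [detrend, band_power]
```

In a framework like pySPACE such chains are declared as configuration rather than hard-coded, which is what makes systematic comparison between many method variants practical.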