The ambition of the SARAS project is to develop a solo-surgery system able to support the surgeon during Robotic Minimally Invasive Surgery (R-MIS) by executing tasks autonomously.

Perception & Decisional Autonomy, Cognitive Control, Advanced Planning & Navigation, and Human-Robot Interaction are the core technologies of SARAS, enabling the implementation of the three different and increasingly complex project platforms.


The cognitive architecture for decision making will support the surgeon during minimally invasive surgery by: (i) executing tasks autonomously, (ii) providing advanced feedback for a better understanding of the procedure, and (iii) proposing solutions, virtual fixtures and interventional checklists to improve the safety of the intervention.


1. The Perception module will process raw sensor measurements, merge heterogeneous multi-modal sensor information and register pre- and intra-operative data in order to reconstruct a dynamic 3D model of the surgical cavity. This model will be used by the Cognitive module to infer the status of the surgical procedure and to predict its future evolution.
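As a purely illustrative sketch of the kind of processing described above, the snippet below merges point measurements from two hypothetical sensors and rigidly registers the merged intra-operative cloud to a pre-operative model by aligning centroids (a translation-only registration); all names and data are assumptions, not SARAS code.

```python
# Hypothetical perception-module step: merge multi-modal measurements and
# register intra-operative data to a pre-operative model (translation only).

def centroid(points):
    """Mean position of a list of (x, y, z) points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def register_by_centroid(intra_points, pre_points):
    """Translate intra-operative points so their centroid matches
    the pre-operative model's centroid."""
    ci = centroid(intra_points)
    cp = centroid(pre_points)
    shift = tuple(cp[i] - ci[i] for i in range(3))
    return [tuple(p[i] + shift[i] for i in range(3)) for p in intra_points]

# Merge heterogeneous measurements (e.g. stereo endoscope + depth sensor)
endoscope_pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
depth_pts = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
merged = endoscope_pts + depth_pts

# Register against a pre-operative model centred at (10, 10, 0)
pre_op_model = [(9.5, 9.5, 0.0), (10.5, 10.5, 0.0)]
registered = register_by_centroid(merged, pre_op_model)
```

A real system would of course use full rigid or deformable registration (e.g. ICP) rather than centroid alignment; the sketch only shows where such a step sits in the pipeline.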


2. Based on the outcomes of the Perception module, the SARAS Cognitive module will: (i) recognise in real time what the surgeon is doing, so as to react appropriately, (ii) be aware of the current stage of the surgical procedure, (iii) generate sensible guesses about what has to be done next, (iv) decide whether anomalous events are taking place, and (v) based on all these elements, make a decision on what course of action to implement next.
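To make steps (i)–(v) concrete, here is a minimal rule-based sketch of one decision cycle: given a hypothetical procedure phase and the recognised surgeon action, it flags anomalies and picks the assistant's next action. The phase names, actions and rules are invented for illustration and are not the SARAS workflow.

```python
# Expected surgeon actions per (hypothetical) procedure phase
EXPECTED = {
    "dissection": {"cut", "cauterise"},
    "suturing": {"needle_drive", "knot_tie"},
}

# Assistive action the robot should take next in each phase
NEXT_ACTION = {
    "dissection": "retract_tissue",
    "suturing": "hold_thread",
}

def decide(phase, recognised_action):
    """Return (anomaly_flag, assistive_action) for the current step."""
    # (iv) anomaly check: is the recognised action plausible in this phase?
    anomaly = recognised_action not in EXPECTED.get(phase, set())
    # (v) decision: on an anomaly, fall back to a safe hold
    action = "hold_safe" if anomaly else NEXT_ACTION[phase]
    return anomaly, action
```

For example, `decide("suturing", "needle_drive")` yields `(False, "hold_thread")`, while an out-of-phase action such as `decide("dissection", "knot_tie")` yields `(True, "hold_safe")`. An actual cognitive module would use learned models over the perception stream rather than a lookup table.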


3. In the SOLO-SURGERY and LAPARO2.0-SURGERY platforms, SARAS' assistive robotic arms have to move within the patient's body, interact with anatomical structures and cooperate with the main surgeon. Once the SARAS Cognitive module has interpreted the scene, made decisions and selected the actions to be performed, the Planning module must plan the assistive arms' trajectories to carry out such actions, moving the tools mounted on the robotic arms in cooperation with the laparoscopic tools teleoperated by the surgeon via the da Vinci slave arms (SOLO-SURGERY) or manually operated by the surgeon (LAPARO2.0-SURGERY).
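One planning concern implied above can be sketched very simply: generating a tool-tip trajectory and rejecting it if any waypoint enters a no-go region (modelled here as a sphere around a delicate structure). The geometry and function names are hypothetical, not the SARAS planner.

```python
import math

def interpolate(start, goal, steps):
    """Straight-line waypoints from start to goal (inclusive)."""
    return [tuple(s + (g - s) * t / steps for s, g in zip(start, goal))
            for t in range(steps + 1)]

def violates_no_go(waypoints, centre, radius):
    """True if any waypoint comes within `radius` of `centre`."""
    return any(math.dist(p, centre) < radius for p in waypoints)

# Candidate tool-tip path along the x-axis
path = interpolate((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), steps=10)

# Check it against a no-go sphere placed off the path
safe = not violates_no_go(path, centre=(5.0, 3.0, 0.0), radius=1.0)
```

A real planner would replace straight-line interpolation with sampling- or optimisation-based motion planning under kinematic constraints; the check above only illustrates the no-go idea.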


4. The SARAS Human-Robot Interface will be multi-modal, intuitive and natural, allowing surgeons to perform R-MIS by themselves with less training and lower mental fatigue. SARAS will provide advanced, integrated force/tactile feedback and an integrated console with 3D vision, virtual fixtures, no-go regions and compensation of physiological movements, together with suggestions about the surgical procedure, an updated workflow and an interventional checklist.
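As a final illustration, the snippet below sketches how a virtual fixture could translate into force feedback: when the tool tip crosses into a guard zone around a no-go sphere, a spring-like force pushes it back along the outward direction. The gain, geometry and names are illustrative assumptions, not the SARAS interface.

```python
import math

def fixture_force(tip, centre, radius, stiffness=50.0):
    """Repulsive force on the tool tip; zero outside the guard radius."""
    offset = tuple(t - c for t, c in zip(tip, centre))
    dist = math.sqrt(sum(o * o for o in offset))
    if dist >= radius or dist == 0.0:
        return (0.0, 0.0, 0.0)
    penetration = radius - dist
    # Spring force along the outward unit vector, scaled by penetration depth
    return tuple(stiffness * penetration * o / dist for o in offset)
```

For a tool tip at `(0.9, 0, 0)` inside a guard sphere of radius 1.0 centred at the origin, this produces an outward force of about 5 N along x; outside the sphere the force is zero.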