Author: Roli B. Omamuli, EMJ, London, UK
Citation: EMJ Radiol. 2026;7[1]:27-32. https://doi.org/10.33590/emjradiol/970TBR76
INTERVENTIONAL radiology (IR) is a field where precision, timing, and technical skill converge, making the interventional radiologist comparable to a musician on stage. At the recent European Congress of Radiology (ECR) 2026 session titled ‘The Radiologist as a Performer: How AI Supports the Art of Intervention’, experts explored how augmented reality (AR), virtual reality (VR), robotics, and AI are redefining the art and science of IR.
OVERLAYING REALITY: AUGMENTED AND VIRTUAL ENVIRONMENTS IN IR
Laetitia Saccenti, Research Fellow at the National Institutes of Health, Bethesda, Maryland, USA, opened the session by framing VR as “an immersive 3D experience through a head-mounted system.” VR replaces the real-world view entirely and has a wide range of applications for physician training, patient rehabilitation, and safety simulations. Simulations can be used to teach students catheterisation and have been shown to improve their technical skills.1 This immersive experience is invaluable as it offers a risk-free training environment. However, the benefit is not limited to physicians or those in training, as immersive VR environments may reduce stress during procedures or be used in PTSD exposure therapy.
While VR replaces the entire real-world view with a fully digital environment, AR overlays digital elements onto the real world without replacing it, acting to enhance the real-world image with digital information.2 Saccenti explained that, since radiologists already have 3D medical scan data, creating an AR view of the anatomy in question can be relatively straightforward. In IR, AR is used for 3D visualisation in operating rooms (for example, to help visualise scatter dose in fluoroscopy to help reduce occupational dose exposure),3 surgical guidance, collaborative work, and patient rehabilitation.
Key technical considerations include accurate registration: ensuring the digital overlays are perfectly aligned with the real body in the correct position, size, and orientation is essential to minimise errors. Another consideration is real-time needle tracking, which keeps the physician aware of the needle tip throughout the procedure. Head-mounted AR systems are hands-free and offer true 3D vision, with the potential for remote team collaboration. Saccenti noted that navigation and guidance systems are most relevant for IR, mainly for image and needle guidance. Early clinical studies demonstrate promising results: Solbiati et al.4 reported high targeting accuracy during AR-guided thermal ablation of 15 liver lesions, achieving complete tumour ablation in all lesions and >90% coverage of a 5 mm periablational margin in 13 of 15 cases, with no intra- or periprocedural complications.
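A classical building block for the registration step described above is point-based rigid alignment, for example via the Kabsch algorithm, which finds the rotation and translation mapping fiducial points from scan coordinates onto their tracked real-world positions. The NumPy sketch below is purely illustrative (the function name and toy data are the author's own, not any AR system's method):

```python
import numpy as np

def rigid_register(source, target):
    """Kabsch algorithm: find rotation R and translation t that best
    map `source` points onto `target` points in a least-squares sense."""
    src_c = source - source.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    H = src_c.T @ tgt_c                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = target.mean(axis=0) - R @ source.mean(axis=0)
    return R, t

# Toy fiducials: positions in scan coordinates vs. tracked positions,
# related by a known 30-degree rotation and a translation
scan_pts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
tracked_pts = scan_pts @ R_true.T + np.array([5., -2., 3.])

R, t = rigid_register(scan_pts, tracked_pts)
residual = np.linalg.norm(scan_pts @ R.T + t - tracked_pts, axis=1).max()
```

In practice, registration on a deforming, breathing patient is far harder than this rigid toy case, which is precisely why deformable and AI-assisted approaches are under investigation.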
AR is also being implemented on more ubiquitous devices like smartphones and monitors. Saccenti highlighted her team’s work on a 3D-printed smartphone needle guide, which aligns AR overlays with patient anatomy using either 3D models or CT slices.5
Despite the potential of AR, challenges remain that must be addressed before routine clinical adoption, including individual calibration, gesture-based training, precise registration on a moving body, workflow integration, and cost and availability constraints. AI-assisted tools may address some of these roadblocks, for example through deformable registration, context-relevant information delivered in real time, and enhanced procedural precision.
ROBOTICS: HIGH-PRECISION TOOLS FOR SAFE INTERVENTION
Kornelia Kreiser, Head of Neuroradiology, University Hospital of Ulm, Germany, who also holds an advisory role for Mentice, Gothenburg, Sweden, emphasised that current robotics in IR are better described as a “remote control” or “programmable high-precision tool,” rather than an autonomous robot. This distinction is important because, while these systems are programmable and capable of sensing their environment, they require continuous human control and cannot carry out actions autonomously.
In non-vascular interventions, such as biopsies and ablations, robotic systems are often table-, floor-, or patient-fixed and can plan needle trajectories with real-time adjustment to tissue movement.6 According to Kreiser, some systems even allow needle manipulation without X-ray guidance, reducing radiation exposure. As Kreiser noted, some of these robotic systems can “specify the precise needle position in terms of the puncture site, depths, and the correct angulation and tilt.”
Vascular interventions, by contrast, remain more limited. Procedures require multiple preparatory steps, including sheath insertion, catheter navigation through complex arterial pathways, and manipulation of microcatheters and wires. These are tasks that most current robotic systems cannot fully automate. The dynamic and pulsatile nature of blood vessels, combined with highly variable patient anatomy, may further complicate robotic control. As Kreiser noted, robotic assistance in vascular IR can improve precision for parts of the procedure, but cannot yet replace the manual skill required for the full intervention.
Newer systems offer complementary approaches: LIBERTY® (Microbot Medical Inc., Hingham, Massachusetts, USA)7 simplifies the workflow to a single arm and a disposable, low-cost guide for single-wire interventions, while SENTANTE™ (Sentante, Kaunas, Lithuania) allows operators to control multiple devices through a simulator-style interface or a control screen for enhanced precision.
The advantages of robotics in IR are clear: for patients, there is potential for reduced radiation exposure,8,9 shorter procedure times in complex cases, and improved precision. For staff, faster procedures mean lower occupational radiation exposure and reduced physical strain from heavy lead aprons, as well as fewer human handling errors. Remote control capabilities also open the possibility of tele-interventions, potentially increasing access to procedures in underserved regions and during off-hours, much as the da Vinci® (Intuitive Surgical, Inc., Sunnyvale, California, USA) surgical system has done over the last two decades.
Challenges remain, including high costs, logistical complexity, the need for general anaesthesia in most robotic procedures, and limited automation for vascular interventions. Nevertheless, Kreiser envisions a future in which robotic systems could carry out more autonomous steps, such as guiding a catheter from the groin to the carotid artery using imaging data, with integrated sensing to navigate curves and vessel walls safely.
TRANSFORMING THE IR TOOLKIT: AI FOR TRACKING, PLANNING, AND GUIDANCE
Framing IR as a time-dependent discipline, Marco Calandri, Associate Professor of Diagnostic and Interventional Radiology, University of Turin, Italy, described the interventional radiologist as a “performer,” operating not only in space, but in time. Unlike diagnostic radiology, which is largely based on static datasets and retrospective interpretation, IR requires sequential decision-making, real-time adaptation, and precise execution, where outcomes are directly shaped by each procedural step. As Calandri noted, a diagnostic report describes findings, whereas an IR report narrates the procedure, underscoring the importance of timing and progression.
Within this dynamic framework, AI is increasingly redefining the ‘instrument’ of the operator, supporting multiple stages of the interventional workflow. A particularly well-established application is lesion segmentation, where convolutional neural networks and U-Net architectures enable automated identification of tumours and organs at risk. Wasserthal et al.10 demonstrated robust segmentation of over 100 anatomical structures, reducing inter-operator variability and enabling standardised volumetric assessment, thereby improving reproducibility across clinical practice and research.
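Automated segmentations of the kind described above are typically evaluated against a reference delineation using an overlap metric such as the Dice coefficient, which also underpins volumetric comparisons between readers. The sketch below uses toy binary masks and is illustrative only, not the pipeline of the cited study:

```python
import numpy as np

def dice_coefficient(pred, ref):
    """Dice overlap between two binary masks (1.0 = perfect agreement)."""
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    return 2.0 * intersection / total if total else 1.0

# Toy 3D volumes: a model prediction vs. a reference annotation
ref = np.zeros((16, 16, 16), dtype=bool)
ref[4:12, 4:12, 4:12] = True      # reference lesion: 8x8x8 voxels
pred = np.zeros_like(ref)
pred[5:12, 4:12, 4:12] = True     # prediction misses one axial slice

dice = dice_coefficient(pred, ref)  # ≈0.933 for this toy case
```

Reporting such overlap scores per structure is one way segmentation tools reduce inter-operator variability: the same metric, computed the same way, applies to every case.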
AI is also being applied to procedural planning, particularly in the selection of optimal needle pathways. In a proof-of-concept study by Kisting et al.,11 AI-generated puncture paths for lung biopsy were found to be concordant with expert physician decisions and were considered safe, although prospective validation remains necessary. Despite these promising findings, Calandri suggested that real-world adoption remains limited, placing AI-guided planning in what he described as the “peak of inflated expectations.” In contrast, stereotactic-based planning is a more mature process, less dependent on AI. As demonstrated in the STEREOLAB trial, this approach uses a 3D coordinate system derived from CT or MRI to guide precise probe placement, forming part of a standardised workflow that includes advanced imaging, stereotactic guidance, and ablation confirmation.12
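The geometric core of the needle-path screening described above can be reduced to a simple check: sample points along a candidate straight-line trajectory through a labelled planning volume and reject the path if it crosses a critical structure. The sketch below is a deliberately minimal illustration; the function name, the `CRITICAL` label value, and the toy volume are the author's own assumptions, not any clinical system's implementation:

```python
import numpy as np

CRITICAL = 2  # assumed voxel label for structures the needle must avoid

def path_is_safe(volume, entry, target, n_samples=200):
    """Sample points along the straight entry->target line and check that
    none fall in a voxel labelled as a critical structure."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    for s in np.linspace(0.0, 1.0, n_samples):
        i, j, k = np.round(entry + s * (target - entry)).astype(int)
        if volume[i, j, k] == CRITICAL:
            return False
    return True

# Toy labelled volume: 0 = background, 2 = a vessel to avoid
vol = np.zeros((32, 32, 32), dtype=int)
vol[10:22, 5, 25] = CRITICAL           # vessel running along the i-axis

safe = path_is_safe(vol, entry=(0, 0, 0), target=(31, 31, 31))
blocked = path_is_safe(vol, entry=(0, 5, 25), target=(31, 5, 25))
```

Real planners must additionally weigh entry angle, pleural transgressions, and safety margins around structures, which is where expert concordance, as assessed in the proof-of-concept work, becomes the relevant benchmark.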
Tracking and navigation represent another critical area, particularly given the challenges of respiratory motion, organ deformation, and target displacement during procedures. Studies have shown that respiratory phase and motion can significantly impact procedural accuracy and complication rates, highlighting the need for consistent tracking strategies.13 AI offers potential solutions through motion modelling, real-time lesion tracking, and continuous target updating. However, Calandri noted that in many complex interventions performed under general anaesthesia, respiratory motion may be less problematic, and real-time imaging modalities such as ultrasound can already provide effective guidance. Therefore, Calandri positioned the domain of AI tracking closer to the ‘trough of disillusionment’, where technical potential exists, but clinical impact is still being defined.
Beyond tracking, AI is contributing to advanced navigation and trajectory planning through integration with electromagnetic14 and optical systems. These technologies enable features such as collision avoidance, vessel protection, multi-needle coordination, and predictive coverage simulation. While some of these systems are already commercially available, Calandri emphasised the importance of distinguishing truly AI-driven solutions from those based primarily on optical or electromagnetic control. He also highlighted the need to demonstrate meaningful clinical benefit and long-term financial sustainability.
A further focal development lies in deformable image registration for ablative margin assessment, a critical determinant of local tumour control. In the IAMCOMPLETE study, intraprocedural CT co-registration was feasible in most cases and enabled reproducible margin evaluation, although limitations remained in a subset of patients.15 Work by Lin et al.16 also demonstrated that AI-supported deformable registration not only increases applicability across cases, but also improves predictive performance for residual tumour and 1-year local tumour progression.
Importantly, these technical advances are now translating into clinical outcomes. The COVER-ALL trial showed that AI-based ablation confirmation software significantly increased minimal ablative margins compared with standard assessment (5.9 mm versus 2.2 mm; p<0.0001), with a corresponding trend towards reduction in local tumour progression at 2 years.17 This highlights the potential of AI not only to enhance procedural precision, but also to standardise quality across operators.
Calandri concluded that AI is fundamentally reshaping IR by enhancing the tools available to the operator. However, he cautioned that improved technology does not eliminate the need for expertise: better instruments require expertise, not improvisation. The broader challenge, he suggested, is not simply to enable exceptional individual performance, but to achieve consistent, high-quality outcomes across all practitioners.
Taken together, the perspectives shared by Saccenti, Kreiser, and Calandri illustrate how IR is evolving into a technologically enhanced performance, where visualisation, precision tools, and intelligent systems converge. AR and VR expand how operators perceive anatomy, robotics refines how they act within it, and AI supports decision-making across each stage of the procedure. Yet, as these technologies continue to mature, their value will depend not only on technical capability, but on meaningful clinical integration and operator expertise. As such, the future of IR may not lie in replacing the performer, but in equipping them with increasingly sophisticated instruments to deliver more consistent, precise, and accessible care.