A NEW study has found that revealing discordant AI results in mammography reports can significantly impact patient trust, anxiety levels and follow-up intentions. The research suggests that providing context alongside AI findings may help maintain confidence in radiologists and reduce unnecessary worry.
Artificial intelligence in mammography is increasingly used to support radiologists by flagging potential abnormalities. However, when AI findings conflict with a radiologist’s assessment, the resulting uncertainty can influence patient behaviour, highlighting the importance of how these results are communicated.
Study Highlights Rising Concerns
Researchers surveyed 600 women undergoing routine mammography across two academic centres in Milan, Italy, between January 2023 and January 2024. Participants, with a mean age of 55, were randomly assigned to one of four hypothetical scenarios: a standard radiologist report with no AI mention, an AI result in agreement with the radiologist, an AI result flagged as discordant, or a discordant AI result accompanied by an explanatory note. The study measured trust in the radiologist, anxiety, intent to seek a second opinion, consideration of legal action, and approval of AI integration. When AI results were discordant and presented without context, trust dropped sharply from 90.1 to 73.0 on a 100-point scale, anxiety rose from 16% to 58%, second-opinion requests surged from 8.7% to 50%, and consideration of legal action increased from 38.7% to 60.7%.
Explaining AI Helps Patients Understand
Adding a contextual explanation alongside discordant AI results significantly blunted these adverse responses: anxiety fell to 25.3%, trust was largely preserved, and participants were less inclined to seek second opinions or consider litigation. Across all groups, approval of AI remained above 85%, suggesting that patients stay generally receptive to AI support when results are clearly communicated.
Clinical Implications for Mammography
The findings emphasise the need for clear communication strategies in AI-integrated mammography. Providing context about the limitations and role of AI can prevent unnecessary anxiety, reduce the risk of unwarranted follow-ups, and protect against potential medicolegal issues.
While the study focused on hypothetical scenarios rather than real-world outcomes, the insights offer practical guidance for radiologists and healthcare institutions implementing AI tools. Clear, contextualised disclosure may serve as a simple but effective strategy to balance technological innovation with patient confidence and emotional wellbeing.
Reference
Pesapane F et al. Should AI results be disclosed in mammography reports? A randomised survey study of patient responses to concordant and discordant interpretations. Eur Radiol. 2026. DOI: 10.1007/s00330-026-12405-x.