A new industry study highlights significant challenges in laboratory digitalisation, revealing that many scientists find current electronic lab notebooks (ELNs) fall short of modern research requirements. The research, conducted by Sapio Sciences, surveyed 150 scientists across the US and Europe and found that the limitations of legacy digital tools are driving an increase in shadow AI – the use of unauthorised public AI tools for work-related tasks – within pharmaceutical and biotech environments.
Fragmented data: from lab to supply chain
The research indicates that digital transformation has yet to resolve fundamental data-sharing issues, creating silos that threaten the entire product lifecycle. Currently, 65% of scientists are forced to repeat experiments because previous results are difficult to locate or reuse. This duplication not only inflates R&D costs but also creates a knowledge gap in which critical process data is lost before it can reach scale-up teams.
Failure to properly manage and contextualise data during the R&D phase creates a disconnect that prevents critical insights from successfully transitioning into manufacturing and the wider supply chain. With only 5% of respondents able to analyse results independently in their ELN, essential insights for downstream tech transfer remain locked behind manual, specialist-heavy processes.
“The survey clearly shows a growing mismatch between modern scientific practice and the capabilities of traditional ELNs,” explains Mike Hampton, Chief Commercial Officer, Sapio Sciences. “When scientists can’t easily build on previous experiments without additional support, frustration turns into real cost.”
Security risks and regulatory compliance
Beyond workflow bottlenecks, the research identifies a growing security concern: 45% of surveyed scientists admit to using public generative AI tools through personal accounts to assist with their work. For pharma, this presents severe risks to intellectual property protection and regulatory compliance. When sensitive R&D data is entered into ungoverned public models, it bypasses the strict data integrity controls required by regulatory bodies, potentially compromising manufacturing validation.
“Scientists aren’t turning to public AI because they want to bypass governance,” says Sean Blake, Chief Information Officer, Sapio Sciences. “They’re doing it because existing lab tools can’t help them analyse results or determine next steps efficiently.”
Scientific demand for advanced integration
The study indicates a consensus among scientists that laboratory technology must evolve beyond simple data documentation: 96% of scientists call for software that actively assists with data interpretation, while 78% express a need for voice-activated interfaces to facilitate hands-free work.
As pharma R&D costs rise – the average cost of developing a drug reached $2.23bn in 2024 – researchers will increasingly seek platforms that prioritise usability and native AI integration over traditional data storage. If data is not contextualised at the point of discovery, it will not translate effectively into manufacturing and the broader supply chain – an outcome that all stakeholders will be keen to avoid.