In the UK health and social care sector, data and cyber threats are relentless. Breaches are growing in number and causing significant disruption: organisations lose access to files or systems and, in the worst cases, sensitive personal data belonging to both staff and patients.
For example, in 2024, Synnovis, a provider of pathology services to several healthcare organisations, including the NHS, was the victim of a ransomware attack. Services were disrupted across the UK, demonstrating the significant, real-world impact that breaches in this sector can have on patients.
Providers in the sector are being encouraged to use systems powered by Artificial Intelligence (AI) to increase efficiency and reduce backlogs. For example, new tools are available to provide administrative support, diagnostic assistance and demand forecasts.
Organisations that shy away from technological advances risk falling behind, yet introducing new tools, policies and third-party systems can increase exposure to cyber risks. The stakes are high in the health and social care sector, where any breach can compromise sensitive patient data and disrupt essential services. To move forward safely, providers must strengthen governance and build cyber resilience.
Managing data risks in a digital age
Due to rising cost pressures, Bring Your Own Device (BYOD) initiatives, in which staff use their personal devices for work, have been on the rise in the health and social care sector.
These initiatives bring greater risk, particularly as many staff report low levels of digital skills and confidence, leaving systems vulnerable to attack. Sensitive data is being handled by staff every day, but the systems, training, and support to protect it are not always in place.
Strict, clear data policies and training are needed to support the safe use of digital technology, specifically when BYOD initiatives are in place. This is the only way to protect against high-risk practices, such as staff using WhatsApp to share sensitive information or connecting to public Wi-Fi to access care systems. These practices could provide an opportunity for hackers to intercept and exploit the personal health data of vulnerable people.
Providers should complete data protection impact assessments (DPIAs) to systematically identify, analyse and minimise data protection risks, thereby meeting UK GDPR requirements, protecting patients and demonstrating compliance with data protection obligations.
Strengthening security across the supply chain
Another key consideration for health and social care providers when protecting patient data is supply chain security, as hackers have realised that a single supplier’s system could give them access to data owned by multiple organisations.
Organisations should undertake thorough third-party due diligence to validate systems and mitigate risk when looking to outsource work to an external company. Where the risks linked to a particular supplier are higher, more frequent and detailed due diligence checks may be appropriate.
Health and social care providers should ensure they have holistic oversight of all suppliers from a data and cybersecurity perspective. For example, NHS Trusts must check that IT suppliers handling patient data have completed the Data Security and Protection Toolkit (DSPT), which is rooted in best-practice principles aligned to the NCSC's Cyber Assessment Framework (CAF).
In cases where personal health data is being input into a third-party AI model, health and social care providers need to use DPIAs to ensure data security risks are identified and steps are taken to mitigate them. Assessments should go beyond standard supplier cybersecurity checks, and the quality and reliability of AI systems must be reviewed carefully.
Ensuring effective AI governance
Crucially, where an AI model is used to inform care outcomes, organisations must be able to verify that meaningful human oversight is embedded in the process, ensuring that AI enhances rather than replaces professional clinical judgement. In social care settings, where less medical intervention is required, there is a higher risk of people defaulting to the AI-proposed output. In all care contexts, however, it is vital that AI serves as a supportive tool that is informative rather than determinative.
AI algorithms can process information rapidly, with decisions made with little to no human judgement involved. This increases the risk of bias arising from poor-quality training data, and raises concerns around transparency and privacy.
Where AI is used for automated decision-making (ADM), organisations must carefully assess the associated risks. While ADM can drive efficiency and streamline processes, it lacks emotional intelligence, clinical judgment, and the ability to consider context or nuance. Consequently, ADM should be avoided for high-risk or high-impact applications in health and social care settings.
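As an illustration only, a human-in-the-loop gate of the kind described above can be sketched in a few lines. The names, risk categories and routing rules below are assumptions for the sketch, not a prescribed implementation: the point is simply that high-risk or high-impact recommendations are routed to a clinician for review rather than auto-applied.

```python
from dataclasses import dataclass
from enum import Enum

class RiskLevel(Enum):
    LOW = "low"    # e.g. appointment reminders, admin scheduling
    HIGH = "high"  # e.g. anything informing a care or clinical outcome

@dataclass
class AiRecommendation:
    patient_ref: str       # pseudonymised reference, never a raw identifier
    proposed_action: str
    risk_level: RiskLevel

def route_recommendation(rec: AiRecommendation) -> str:
    """Return the handling route for an AI-generated recommendation.

    Only low-risk suggestions may be applied automatically; any
    high-risk or high-impact recommendation must go to a human
    reviewer, who can confirm or override it.
    """
    if rec.risk_level is RiskLevel.HIGH:
        return "human_review"        # clinician confirms or overrides
    return "auto_apply_with_audit"   # still logged for retrospective audit

# Example: a high-risk recommendation is never applied automatically.
rec = AiRecommendation("PSEUDO-1234", "adjust medication dose", RiskLevel.HIGH)
print(route_recommendation(rec))  # prints "human_review"
```

The design choice worth noting is that the default path for anything high-risk is human review, and even auto-applied low-risk actions are logged so that decisions remain auditable after the fact.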
Cyber threats are growing in scale and sophistication, and the real-world consequences of breaches highlight the critical importance of getting data security right. Providers cannot afford to be complacent. Completing DPIAs, adhering to DSPT standards, and ensuring human oversight of AI systems and models are essential practices to safeguard sensitive patient data and maintain compliance with UK GDPR.
As health and social care providers continue to adopt digital technologies to improve efficiency and patient outcomes, the need for robust data and supply chain protections has never been greater.
Key takeaways
- AI systems are increasingly being used in health and social care settings to increase efficiency and support human judgement.
- Clear data policies and staff training are essential when deploying BYOD initiatives to reduce risk for patients and employees.
- With hackers increasingly targeting suppliers, ongoing due diligence across the supply chain is crucial, and health and social care providers should complete DPIAs, adhere to DSPT standards and assess third-party tools regularly.
For more information, please contact me.