
Privacy implications of AI-powered medical devices


Confidentiality is and has always been an important legal and ethical duty among medical practitioners.

Although it is not an absolute principle, as some information can be shared in certain circumstances, doctors’ personal and organisations’ collective responsibilities for protecting patient data took on an added layer of complexity with the introduction in the UK of the EU General Data Protection Regulation (GDPR) in May 2018.

While doctors in the UK can refer to guidance from the General Medical Council (GMC) on patient confidentiality, which has been updated to reflect GDPR requirements, organisations that use medical data to produce and test products have a less clear framework for compliance.

Many life sciences companies rely on large reservoirs of medical data that they have collected for a particular purpose – usually to inform the development of a drug or device for treating or monitoring a specific condition.

The use of artificial intelligence (AI) or machine learning (ML) to analyse these pools of medical data more quickly and accurately has been one of the most noticeable trends in life sciences in recent years.

Many medical devices use technology to assist with treatment, keep track of health metrics or even spot trends.

They are now increasingly being used to make automatic decisions based on these patterns.

With AI and ML techniques, data can be deployed to improve both the primary purpose (the specific medical condition it has been collected to analyse) and the device itself.

Here, we look at what organisations need to consider when collecting and using personal data for medical device applications and how they can navigate increasingly onerous privacy obligations.

How does the law treat medical data?

From a privacy perspective, the kind of information required to inform AI-driven medical devices will almost inevitably be classed as "special category data" under GDPR.

Special category data is broadly similar to the concept of sensitive personal data under the UK’s Data Protection Act 1998, but one key difference is that GDPR brings genetic data and some biometric data within the definition.

In order to legally process special category data, a data controller must first identify both a lawful basis for general data processing under Article 6 of GDPR and a separate condition for processing special category data under Article 9.

What data do you need?

Developers of AI-enabled tools need to look carefully at what data goes into the decision-making process of their technology, consider what type and amount of data they need to achieve their stated purpose, and decide how long they need to keep that data.

They also need to consider their data subjects’ rights to privacy and avoid infringing them.

People have a right to transparency, which means the data collector has an obligation to tell the person or people who are providing the data what it is they want to do with it.

As with most questions about privacy, there is a balance to be struck with reference to the specific circumstances of each case.

Anonymisation is best

One important point to bear in mind is that GDPR, and therefore the need for consent, does not apply to anonymised data.

But the regulation is strict about what anonymity means.

For data to be truly anonymised under GDPR, the data collector must irreversibly strip it of any elements that would enable an individual to be identified.

Masking or pseudonymising identity is not sufficient, and there have been cases of organisations mistakenly believing they have anonymised data when the means they have used do not satisfy GDPR.
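The distinction can be illustrated with a minimal sketch. The record, field names and identifier below are entirely hypothetical, and the point is conceptual rather than a prescription for any particular technique: hashing an identifier is pseudonymisation, because anyone holding a candidate identifier can recompute the hash and re-link the record, whereas anonymisation requires irreversibly removing or generalising every identifying element.

```python
import hashlib

# Hypothetical patient record, for illustration only.
record = {"nhs_number": "943 476 5919", "postcode": "SW1A 1AA", "diagnosis": "type 2 diabetes"}

# Pseudonymisation: replacing the identifier with a hash. This is STILL
# personal data under GDPR, because the transformation is re-linkable.
pseudonymised = dict(record)
pseudonymised["nhs_number"] = hashlib.sha256(record["nhs_number"].encode()).hexdigest()

# Re-linkage is trivial for anyone who can guess or obtain the identifier:
assert pseudonymised["nhs_number"] == hashlib.sha256("943 476 5919".encode()).hexdigest()

# Anonymisation: irreversibly dropping or generalising identifying elements
# (here, keeping only the outward postcode area and the diagnosis).
anonymised = {"postcode_area": record["postcode"].split()[0], "diagnosis": record["diagnosis"]}
```

Note that even the "anonymised" version would fail the GDPR test if the remaining fields, in combination, still singled out an individual — which is exactly the trap the cases mentioned above fell into.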

Remember, there is no "one size fits all"

One privacy issue that crops up frequently in medical devices is the question of what to do if the data you have collected for one particular purpose might be useful for a purpose other than that for which consent was obtained.

Such a situation may arise intentionally, if an organisation decides it wants to use its data reservoir for maximum commercial advantage; or accidentally, if the AI reveals correlations that the company may not have been looking for but which may be useful for developing another kind of treatment.

Article 5 of GDPR imposes a "purpose limitation" principle on personal data processing, stating that: “personal data shall be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”.

Companies can tackle this by:

  • Re-applying to the data subjects for specific consent to use the information for additional purposes;

  • Demonstrating that the new purpose is compatible with the original purpose for which the data was collected; or

  • Justifying that new purpose under an available Article 6 legal basis and Article 9 condition.

Legal bases

Satisfying the Article 6 and Article 9 requirements has to be done on a case-by-case basis and comes down to balancing people's rights to privacy against the benefits of using their data for a specific purpose.

Justifying the primary purpose of the device for the individual patient is not normally difficult.

Difficulties arise when the data is being used for other ("secondary") purposes, such as for the benefit of other patients – one of the main drivers of the use of AI/ML – or to improve the device.

"Legitimate interest" is the broadest legal basis under Article 6. For medical devices, making a case that there is a legitimate interest should be relatively straightforward, even for these secondary purposes (and even when there is a commercial benefit), on the grounds that information from one group of patients can be used to help improve diagnostics or treatment for others.

However, this interest needs to be balanced against the rights of the original patient.

Consent and Article 9

Since GDPR came into force, most people have become familiar with "consenting" to digital privacy notices.

When it comes to medical devices, the question of consent can be slightly more complicated.

If you are a doctor treating a patient, you might assume that because you have consent to administer treatment, this extends to use of the patient's data.

But the kind of consent you get for treatment is not necessarily to a GDPR standard.

There are several reasons for this, but one of the main ones is that the consent might not be freely given – for example, if the patient is suffering from a life-threatening condition that needs to be treated urgently, they will probably accept any kind of treatment.

A great irony of modern data protection law is that it is harder to get consent for data collection and use purposes than it is to get consent for medical treatment.

However, Article 9 of GDPR permits medical data to be processed without specific consent, subject to certain strict conditions.

Research occupies a privileged position within the regulation and under Article 9, organisations may process even special category data for research without consent of the data subject(s), as long as they implement appropriate safeguards.

AI-enabled medical devices may in some circumstances benefit from the scientific research exemption, given their ostensible aim to improve outcomes for patients.

Old pre-GDPR data

Another issue often faced by established life sciences businesses with deep reservoirs of sometimes very old data is how to use data collected before GDPR came into force – especially when consent was relied upon under the old regime.

Pre-GDPR, medical researchers had relatively wide-ranging permission to access and use data records based on consent that would not meet the very high standard of the new regulation.

Understandably, companies and organisations will be reluctant to delete these records and start from scratch, even if that is technically possible in some AI/ML environments.

However, they cannot continue to use this information on the same terms as they did before GDPR.

In this instance, companies have the option to re-apply to the data subjects for GDPR-compliant consent.

Alternatively, they can think about adopting a phase-out approach, replacing pre-GDPR data records with new GDPR-compliant data sets.

Right to be forgotten?

Article 17 of GDPR deals with data subjects' right to erasure (or the "right to be forgotten").

This issue comes into play when organisations wish to retain medical data for future use, against the wishes of the data subjects.

The right to be forgotten is not an absolute right, however; it depends on the legal basis on which the data was collected and processed in the first place.

It would not apply, for example, where the organisation relied on "legitimate interest" and there was an overriding ground to keep the data – perhaps for scientific research.

In the case of medical devices, removing a single person's information from the data set may be impractical if it compromises the efficacy of the entire tool.

Nevertheless, developers of AI-enabled tools need to consider from the outset the practicality of deleting individuals' data from a system if they receive a request to do so.
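One pragmatic design choice — a toy sketch of our own, not anything mandated by GDPR or described above — is to key every training contribution to a subject identifier from the outset, so that an erasure request can at least locate and delete that person's samples, after which the model can be retrained or refreshed from what remains. All class and field names here are hypothetical.

```python
from collections import defaultdict

class TrainingDataStore:
    """Toy store that keys each training sample to the contributing
    subject, so a right-to-erasure request can be honoured by deleting
    that subject's key and rebuilding the model without it."""

    def __init__(self):
        self._records = defaultdict(list)

    def add(self, subject_id, sample):
        """Record a training sample against the subject who provided it."""
        self._records[subject_id].append(sample)

    def erase(self, subject_id):
        """Delete every sample linked to this subject and return them.
        Any model trained on the old data must then be retrained."""
        return self._records.pop(subject_id, None)

    def training_set(self):
        """Flatten the remaining samples into one training set."""
        return [s for samples in self._records.values() for s in samples]

store = TrainingDataStore()
store.add("subj-001", {"heart_rate": 72})
store.add("subj-002", {"heart_rate": 88})
store.erase("subj-001")  # training_set() now holds only subj-002's data
```

This does nothing to solve the harder problem the paragraph above raises — a model already trained on the deleted data still embodies it — but it makes the deletion request at least actionable at the data-set level.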

Communication is key

As discussed above, data subjects have a right to know why an organisation is collecting and processing their data and what they intend to do with it.

This is dealt with by Article 13 of GDPR, which requires data collectors to provide a privacy notice at the point of data collection.

For highly technical AI-powered medical devices, coming up with an explanation that the average person who might benefit from the tool will understand is especially difficult.

As such, the content and delivery of a privacy notice cannot simply be an afterthought, but requires careful consideration – in the first instance and also in relation to any updates.

The jury is still out on what constitutes the most effective kind of privacy notice, but consensus seems to be that the shorter and more dynamic a notice is, the more likely people are to read it.


For the medical device industry, GDPR has added another layer of compliance to what was already a heavily regulated field.

AI has great potential to benefit millions of people by revolutionising diagnostics and treatment, but these benefits need to be balanced against people's right to privacy.

The ends do not always justify the means.

Organisations with huge reservoirs of historic data need not feel paralysed by GDPR, but they do need to think carefully about how they manage the transition of legacy data systems to new compliant formats.

Fieldfisher has come up with a checklist of six issues medical device developers need to consider when rolling out any new tool that relies on data:

  1. Can you anonymise any personal data?

  2. What legal basis are you relying on? Are you dealing with special category data and, if so, what Article 9 condition applies?

  3. Are you dealing with any data collected pre-GDPR?

  4. Do you need to rely on consent and, if so, is your language sufficiently clear and granular?

  5. Do you have an appropriate privacy notice?

  6. What are the particular data subject rights at stake and how are you meeting your obligations to protect these rights?

For more information about our sector-leading expertise in medical devices, privacy and technology, please contact the article's authors or visit the relevant pages of the Fieldfisher website.


