
Europe Data Protection Digest | Notes from the IAPP Europe Managing Director, 7 July 2017

Greetings from Brussels!

This week saw an interesting decision in the U.K. regulatory sphere. The U.K. Information Commissioner's Office found that the Royal Free London NHS Foundation Trust violated the U.K. Data Protection Act when it shared sensitive health information of 1.6 million patients with DeepMind, a Google subsidiary that employs artificial intelligence, according to an ICO news release.

If you’ll recall, I wrote an MD Notes on this partnership back in May 2016, asking whether patients had actually signed up for the sharing of their health care data. The issue once again highlights questions about how consent is obtained and how patient rights are respected. Consent is a key concept in the provision of health care, across ethical, legal and practical considerations alike. As it stands, the NHS operates under "implied consent": it does not require explicit patient consent for direct treatment and care. The great unknown was whether implied consent could be stretched to cover the scope of activity that DeepMind would undertake.

On Monday, the ICO's report on its investigation found "a number of shortcomings" in how the Royal Free provided the personal data of patients to support the DeepMind app. Most notably, the report held that the Royal Free was not sufficiently open with the patients whose data was shared, meaning that they "would not have reasonably expected their information to have been used in this way." The Trust has been asked to sign an undertaking committing it to changes that ensure it is acting in line with the law. Speaking on the decision, Information Commissioner Elizabeth Denham said, “There is no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.”

Following the ICO investigation, the Trust has been asked to adhere to four principles as follows: establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials; set out how it will comply with its duty of confidence to patients in any future trial involving personal data; complete a privacy impact assessment, including specific steps to ensure transparency; and commission an audit of the trial, the results of which will be shared with the Information Commissioner and which the Commissioner will have the right to publish as she sees appropriate.

In a separate independent review, set up by DeepMind Health to "scrutinize and review" the existing agreement following criticisms last year, the review panel found that DMH had been insufficiently transparent in its original data-sharing agreement with the Royal Free and suggested it publish all future contracts with public sector bodies openly. However, the review also found that DMH did not breach the current Data Protection Act, as it was acting only as a "processor" of the data rather than a "controller," as the Royal Free was; the data controller, the party that collects the data, bears more responsibility than the party it passes that data on to. This is an interesting finding given that, under the GDPR, processors (third parties) will carry more direct responsibility. Under the new rules coming into force next May, DMH might well have been found co-liable in the present ICO ruling.

Commissioner Denham offered an interesting reflection on the case: while the partnership is using technology for good, and the Trust has reported successful outcomes to date, some may conclude that data protection rights are a small price to pay for this. Denham holds that the shortcomings identified by the investigation could have been avoided. The message here is one of conducting proper and timely data impact assessments, together with a thorough analysis of whether the benefits are likely to be outweighed by the data protection implications, in compliance with the law. Ultimately, it’s a case of using data and innovation in a diligent manner and knowing that, no matter how much value the technology can bring, you need to walk before you can run.
