A key British regulator said on Monday that a deal between Google’s artificial intelligence (AI) firm DeepMind and the UK’s National Health Service (NHS) “failed to comply with data protection law.”
Google acquired DeepMind in 2014 and in 2015 struck a deal with the Royal Free NHS Foundation Trust, which runs a number of hospitals in Britain. Under the agreement, revealed in full by New Scientist in April 2016, DeepMind gained access to a wide range of health information on 1.6 million patients.
The aim of the deal was to help DeepMind develop an app called Streams for monitoring patients with kidney disease; Streams would alert the appropriate clinician when a patient’s condition deteriorated. But New Scientist revealed that DeepMind would also gain access to other health information, such as whether a patient had HIV and details of drug overdoses, which sparked considerable controversy.
The UK’s data protection watchdog, the Information Commissioner’s Office (ICO), launched its probe into the DeepMind-NHS deal in May 2016. On Monday the ICO published its conclusions, finding that the agreement “failed to comply with data protection law.”
Information Commissioner Elizabeth Denham said in a statement, “Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.”
“We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”
The ICO took issue with the fact that patients were not informed about how their data would be used. The ICO’s letter to the Trust said:
“The processing of patient records by DeepMind significantly differs from what data subjects might reasonably have expected to happen to their data when presenting at the Royal Free for treatment.”
“For example, a patient presenting at accident and emergency within the last five years to receive treatment or a person who engages with radiology services and who has had little or no prior engagement with the Trust would not reasonably expect their data to be accessible to a third party for the testing of a new mobile application, however positive the aims of that application may be.”