The UK’s data protection watchdog, the Information Commissioner’s Office (ICO), failed to hold both the government and private sector to account for repeated breaches of private medical data during the COVID-19 pandemic, a new report argues.
The ICO also failed to address obvious concerns raised by the government’s outsourcing of the handling and processing of confidential medical information to for-profit corporations — including those linked to the UK and US national security state.
“Public health programmes were deployed unlawfully, and underpinned by negligent data governance”, the Open Rights Group (ORG) determined, after analysing the use of National Health Service (NHS) data in three major COVID-19 health programmes.
ORG’s findings, released on 25 May 2023, are especially worrying given the government’s current attempts to water down UK data protection rights and obligations with the Data Protection and Digital Information (DPDI) Bill.
The three programmes examined by the ORG are:
- NHS Test and Trace
- NHS Contact Tracing App
- NHS COVID-19 Data Store
It also compared the ICO’s conduct to that of other UK and European regulators.
The ICO offered a brief rebuttal stating that it “does not share” the ORG’s views and that its “priority” during the pandemic was to “ensure organisations understood how data protection law could facilitate action at a time of emergency.” “The ICO achieved this by mobilising a dedicated task force and publishing prompt advice for organisations who were faced with using data in new ways,” it says.
Data Breaches and Policy Failures
Data protection breaches cited by ORG include the leaking of private data onto social media by Test and Trace employees, the use of private data to sexually harass women, and the loss of data stored in Excel spreadsheets.
The government’s failure to require Data Protection Impact Assessments (DPIAs) when it outsourced the collection of personal information on millions of people is among the key reasons such breaches were neither ‘identified’ nor ‘mitigated’, ORG argues.
ORG also notes the government’s U-turn from a more expensive (£35 million), less secure and more intrusive centralised model of contact tracing to a decentralised and less intrusive one. This costly mistake, which delayed the government’s response to the pandemic by months, could have been avoided had it listened to digital security experts at the time and “taken privacy implications into due consideration.”
The COVID-19 app rollout fiasco could also have been avoided if the government had not “ignored the ICO’s advice about a decentralised app,” ORG notes. Despite objecting to the more intrusive centralised model, the ICO failed to hold the government’s feet to the fire as it pressed ahead with that model anyway.
Privatisation of Confidential Medical Information
The UK government entered into contracts with for-profit companies such as CIA-linked surveillance and data analysis firm Palantir and artificial intelligence firm Faculty AI — in spite of the civil liberties, privacy and conflict of interest implications associated with allowing these entities access to large volumes of confidential medical data. The outsourcing of management, analysis and processing of information on the NHS COVID-19 Data Store, which includes confidential medical data provided by multiple government agencies, is one of the more controversial yet less discussed practices to emerge since the COVID-19 outbreak.
Corporations such as Palantir maintain a “continued and growing involvement in the UK health service and wider public services,” after they were originally awarded data handling and analysis contracts on an ostensibly limited basis, ORG writes. In 2020, OpenDemocracy sued the UK government over its secret £23 million deal with Palantir to process private medical data. The UK government ultimately acquiesced to the suit, “committing not to extend Palantir’s contract beyond COVID without consulting the public”.
Yet, as OpenDemocracy reported in March this year, the government has breached its 2021 promise and “ordered” hundreds of NHS hospitals to “share people’s confidential medical records” with a “new central database” managed by Palantir software. The new database “collects daily information about hospital patients – including their dates of birth, postcodes and detailed medical histories – that was previously held by individual trusts and shared less frequently,” OpenDemocracy explains.
Lord Theodore Agnew, who was Minister of State at the Cabinet Office representing HM Treasury, held a major share in Faculty at the time it was awarded the contract. Faculty offers a wide range of services to the public and private sector, including working with the Home Office and businesses in “combating terrorist propaganda” online.
Faculty is on the Directory of UK Safety Tech Providers endorsed by the Department for Digital, Culture, Media and Sport (DCMS). In 2021, the DCMS announced its “new strategy to fight online disinformation” by ensuring “teachers, carers, librarians and youth workers” will be “trained to help youngsters and disabled people spot disinformation online”. Part of this strategy includes the Online Safety Bill which — as Truth Defence recently detailed — represents a mass-surveillance and mass-censorship agenda that risks obliterating the rights to private communications and to seek out and access information. Faculty, and other companies in the Safety Tech Directory, “will play an increasingly important role in supporting compliance with the new regulation,” DCMS says.
Other companies working with the government to analyse medical data include consultancy giants McKinsey and Deloitte.
The ICO’s failure to address the obvious concerns regarding the handling and processing of personal and confidential medical data, including via private no-bid contracts, forced civil society and independent media organisations — such as ORG, Foxglove, medConfidential, Good Law Project and OpenDemocracy — to attempt to fill the void.
Breach of Promise that Medical Data Would Remain Anonymous
Despite repeated assurances that the private medical data of millions of people would be anonymised, government documents revealed “that patients’ identifiable personal health data is indeed being processed” with Faculty having access to “sensitive personal data” and Palantir processing “pseudonymised personal data”, Phil Booth, coordinator of medical privacy organisation medConfidential, explained in an interview with this author in 2020.
“One of the more striking things these documents reveal is the shambolic mess and lack of accountability around what must be one of the largest aggregations of English patients’ medical histories in recent times,” Booth said. “The ‘sole data controller’ for the Data Store, NHS England, isn’t even the legal entity that formed the contract with the most controversial data processor in the set-up, Palantir – and there’s no public evidence of the sort of processes and procedures around access to data that other parts of the NHS have been doing routinely for years,” he added.
Booth also added that while NHS England “may have a lawful basis in the pandemic to hoover up, process and even hand over copies of our data,” it appeared to have “failed to grasp that it cannot do so in secret” and that “it must still comply with all of the rest of the law, including patients’ lawful dissent.”
The appropriate and lawful handling of personal data is a key aspect of what the ICO was established to watch over. But it failed to take appropriate action to remedy NHS England’s breaches of its data handling obligations and to meaningfully interrogate the outsourcing of confidential medical data, especially to firms like Palantir and Faculty.
ICO’s Soft Touch Data Protection Failure
ORG excoriates the ICO’s failures to hold the government and the business sector to account, at a time when it was most needed. “The ICO acted as a ‘critical friend’ and did not enforce the law effectively,” ORG writes. In her strategic vision for 2017 to 2021, the ICO’s previous chief described the data regulator as a “critical friend” of “government and organisations”.
This attitude “led to these programmes falling short of important safeguards and data protection requirements” which in turn “exposed the public to significant risks and harms”.
The ICO also “stayed” complaints at the Information Tribunal, preventing people from pursuing data breach complaints for a two-month period in 2020.
The ICO’s soft touch approach is “at odds with the approaches of other UK regulators” such as the Financial Conduct Authority and the Competition and Markets Authority, as well as regulators in countries like Norway and France, ORG says.
“While there is no fault in acknowledging that public and private organisations were operating under difficult circumstances,” ORG writes, “the ICO’s approach was timid and lax, avoiding strong enforcement even when the government’s clear breaches of data protection law were causing harm to citizens.”
None of this bodes well for the future of data protection and the ability to access public information, given that the UK government is attempting to water down rights and obligations currently enshrined in UK data protection law. As Truth Defence reported last week, the DPDI Bill being debated in Parliament represents a full-frontal attack on individuals’ rights to access information and to object to the sharing of their personal information, as well as on data handlers’ obligations to protect personal information. All of these rights and obligations took decades of campaigning to secure.