I've analysed the data protection impact assessment (DPIA) for the NHSX Isle of Wight app trial. It reveals very significant legal flaws. The paper I have written on it can be found here: https://osf.io/preprints/lawarxiv/6fvgh. I'll go through the main ones in this thread 1/n
1: The DPIA reads like a fight between PR folk who want to say the data is anonymous and data protection folk who must say that, legally, it is not. DPIAs are no place for PR. This data is not anonymous.
Here are three easy scenarios showing this data is really not anonymous (quite apart from the fact that data cannot be anonymous in data protection law where a unique device identifier is involved).
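To make the identifier point concrete, here's a minimal sketch in Python (made-up records and field names, not the actual NHSX schema) of how a single persistent identifier lets anyone holding the data link sightings into a movement profile, and from there to a person:

```python
# Minimal sketch: a persistent identifier defeats "anonymity".
# Hypothetical observation records, NOT the actual NHSX schema.
from collections import defaultdict

observations = [
    {"sonar_id": "device-42", "place": "pharmacy", "time": "Mon 09:10"},
    {"sonar_id": "device-42", "place": "school gate", "time": "Mon 15:30"},
    {"sonar_id": "device-42", "place": "clinic", "time": "Tue 11:05"},
    {"sonar_id": "device-17", "place": "pharmacy", "time": "Mon 09:12"},
]

# Group sightings by identifier: one stable ID links a whole movement
# profile, which can then be matched to a known individual (e.g.
# whoever stands at that school gate every weekday at 15:30).
profiles = defaultdict(list)
for obs in observations:
    profiles[obs["sonar_id"]].append((obs["time"], obs["place"]))

for device, trail in profiles.items():
    print(device, "->", trail)
```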
2: The DPIA states that collecting personal data is always done voluntarily. It does not properly admit that this is untrue: by design, the NHSX app works by other people uploading information about you, including third parties you were colocated with.
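A sketch of why "voluntary" breaks down (a guess at the general shape of the flow, with illustrative field names, not the real NHSX payload): one user's voluntary symptom report carries identifiers of everyone they met, none of whom did anything at all:

```python
# Sketch of the centralised upload flow described above.
# Field names are illustrative, NOT the real NHSX payload.

def build_symptom_report(my_id, contact_log):
    """One user's voluntary upload carries everyone they met."""
    return {
        "reporter": my_id,
        # Other people's IDs, uploaded without any action on their part.
        "contacts": contact_log,
    }

contact_log = [
    {"sonar_id": "device-17", "duration_min": 12, "rssi_avg": -60},
    {"sonar_id": "device-93", "duration_min": 3, "rssi_avg": -80},
]

report = build_symptom_report("device-42", contact_log)
# device-17 and device-93 never pressed a button, yet data about
# them now sits on the central server.
print(report)
```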
3: The DPIA states there is no systematic monitoring of a publicly accessible area on a large scale (GDPR art 35(3)(c)). Yet every citizen is turned into a sensor, and evidence to Parliament talks about postcodes and hotspots. This needs to change.
4: The DPIA states that users will not be deprived of their data rights, but then goes on to explain how they will be. You will not be permitted access to your data, because the app will not let you see your 'Sonar ID', and so NHSX claims it cannot find you in the database.
There is literally no reason for this. You emit data through Bluetooth all the time that NHSX will be able to reverse into your Sonar ID; this is not secret or sensitive. It violates data protection by design (GDPR art 25), which obliges building data rights in, not designing them out.
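To see how un-secret these broadcasts are, here's a rough sketch using the `bleak` Python library: any passer-by with a laptop can already capture the advertisement payloads your phone emits, the very data NHSX can reverse into a Sonar ID, so withholding that ID from the user protects nothing:

```python
# Sketch: Bluetooth LE advertisements are observable by anyone nearby.
# Uses the real `bleak` library; everything printed here is already
# broadcast in the clear by surrounding phones.
import asyncio
from bleak import BleakScanner

def on_advertisement(device, adv_data):
    # Every nearby phone's broadcast arrives here. An observer sees
    # the same payload the central server can link to a Sonar ID;
    # only the user themselves is kept in the dark.
    print(device.address, adv_data.service_uuids, adv_data.service_data)

async def main():
    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(10.0)  # listen for ten seconds
    await scanner.stop()

asyncio.run(main())
```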
5: NHSX also deny the right to erasure of data on the server, and provide no lawful basis for this blanket refusal. You cannot refuse data deletion without a reason.
6: The right to object is entirely missing: the DPIA does not mention it once, despite it applying specifically to the lawful basis NHSX relies on (art 6(1)(e)).
7: There is no valid lawful basis laid out for automated decision-making. The COPI Regulations they rely on do not work for this: they authorise processing, not decision-making, and those are different things. There are other mistakes related to this.
8: The logic of the algorithmic systems and risk scores is embedded as a PDF inside a Word file, which has itself been converted into a PDF, so the embedded file can no longer be opened. By law, this logic has to be provided under Article 13.
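For illustration only, here is the generic shape such Bluetooth risk-scoring logic typically takes (duration weighted by RSSI-estimated proximity). The weights and thresholds below are invented; the actual NHSX algorithm remains inaccessible, which is precisely the Article 13 problem:

```python
# Generic illustration of BLE contact risk scoring. This is NOT the
# NHSX algorithm: that logic is locked inside an unopenable embedded
# PDF, and users cannot inspect the real weights.

def event_risk(duration_min, rssi_dbm):
    # Stronger signal (less negative RSSI) suggests closer contact:
    # scale from 0 at -90 dBm up to 1 at -60 dBm (invented values).
    proximity_weight = max(0.0, (rssi_dbm + 90) / 30)
    return duration_min * proximity_weight

def total_risk(events):
    return sum(event_risk(e["duration_min"], e["rssi_avg"]) for e in events)

events = [
    {"duration_min": 12, "rssi_avg": -60},  # long, close contact
    {"duration_min": 3, "rssi_avg": -80},   # brief, distant contact
]
print(total_risk(events))  # 13.0 under these invented weights
```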
9: NHSX state they are only 'briefing' the ICO, but given the inherent risks of their system, which cannot be fully mitigated, they need to engage in prior consultation (GDPR art 36). From the DPIA, they do not appear to be doing this.
10: It is unclear how NHSX deals with ePrivacy law, and whether the trackers in the app are lawful under it.
NB: I could not analyse the risks: they are *all redacted*. No benefit/risk analysis is possible, and no part of the DPIA allows or attempts comparison with other approaches to the same purposes. That is what a DPIA is for, and this public version fails on that front.