Safeguarding gone wrong? Project Alpha and the accidental weaponisation of personal data

Graffiti of a surveillance camera on a concrete wall with the text 'for your safety & our curiosity'.
Photo taken by Etienne Girardet.

The recent release of a redacted data protection impact assessment (DPIA) for a Met police data collection scheme, Project Alpha, suggests that the force has been profiling children's personal data on a large scale in an attempt to tackle gang violence. Amid allegations of implicit racial bias, invasions of privacy, and an apparent lack of oversight, Project Alpha provides a stark example of how personal data can be accidentally or deliberately weaponised in the name of safeguarding.

What is Project Alpha?

Last week, the Guardian published details of a freedom of information request regarding Project Alpha, a Met police operation that started in June 2019 and aimed to 'prevent loss of life and safeguarding against those most vulnerable and affected by gangs'. What the released documents suggest, however, is that the project has used online information (such as social media profiles) to profile children on 'a large scale'.

The Met police scheme, according to the document, was designed to 'carry out profiling on a large scale', with males aged between 15 and 21 as the focus of the project.

Social media uploads, such as drill music videos and other content the force deemed violent, were reviewed to identify the individuals featured in them. Social media trackers were also to be used extensively. The DPIA also claims that charity groups and the Home Office were consulted before the project began.

Incorrectly completed or insufficiently protected?

The release of the document has prompted denials from the Met police and an outcry from youth, privacy, and human rights campaigners. The Met police claim that the DPIA was incorrectly filled out and that no profiling is being undertaken. Meanwhile, human rights campaigners have condemned the violations of privacy, the implicit racial bias, and the potential effects on vulnerable data subjects (under-18s).

In a statement on the released DPIA, Emmanuelle Andrews of the human rights group Liberty said: "This surveillance and monitoring of young people and children is deeply worrying, impacting their right to express themselves and to participate in friendship and community networks. It can have serious consequences for their futures, such as their ability to access housing, education and work."

Youth violence experts have also since said that they had no involvement with the Project Alpha scheme, despite claims in the DPIA. One such group, Redthread, identified in the document as a charity the Met police had reached out to, issued a statement in response saying it was 'not aware of any stakeholder consultation on Project Alpha'. Several other organisations have followed suit.

The Met police have reiterated that the DPIA was incorrectly completed; however, serious concerns have now been raised about the oversight of Project Alpha and its creation.

What are vulnerable data subjects?

From a regulatory standpoint, the issue here is not that the Met police may have been using social media to aid law enforcement; that is standard practice for most intelligence and police services. Rather, it is that the Met police appear to have failed to give appropriate consideration to these high-risk processing activities involving vulnerable data subjects (such as under-18s).

The GDPR requires extra protective measures to be put in place when data processing activities either A) involve vulnerable data subjects, or B) are likely to create a high risk to the rights and freedoms of the data subjects involved. Indeed, the extent of the latter is often determined by the presence of the former. Where high-risk processing is likely, the GDPR legally requires that a DPIA be completed.
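
To make this screening test concrete, here is a minimal illustrative sketch in Python. The criteria names are our own simplifying assumptions for demonstration, not an exhaustive reading of GDPR Article 35:

    # Illustrative sketch only: a simplified Article 35 screening check.
    # The criteria below are assumptions for demonstration, not legal advice.
    def dpia_required(vulnerable_subjects: bool,
                      large_scale: bool,
                      systematic_profiling: bool) -> bool:
        """Return True if processing is likely 'high risk' and needs a DPIA."""
        # Large-scale profiling, or large-scale processing involving
        # vulnerable data subjects (e.g. children), is treated as high risk.
        return large_scale and (systematic_profiling or vulnerable_subjects)

    # Project Alpha, as described in its DPIA: large-scale profiling of
    # under-18s, so on this reading a DPIA is clearly required.
    assert dpia_required(vulnerable_subjects=True,
                         large_scale=True,
                         systematic_profiling=True)

On even this simplified reading, Project Alpha sits squarely in DPIA territory, which is why the accuracy of the completed document matters so much.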

The Project Alpha scheme, as described in the DPIA, involved multiple high-risk processes (such as profiling) and the large-scale processing of vulnerable data subjects' data. If, as the Met police claim, the DPIA was incorrectly filled out, then the data of vulnerable data subjects has been processed without a compliant DPIA. If the document was accurately completed, then the Met police have been processing children's data without appropriate oversight.

A concerning track record: the Gangs Matrix and racial bias

Perhaps the most significant issue this document has exposed is the implicit racial bias behind the Met's processing. The project's aim to assess 'harmful online content', especially content 'relating to gangs and [youth violence] online', has been taken by many as implicitly targeting drill videos and young black men. Stafford Scott, a veteran community campaigner, said he feared the project was part of a continued assault on young black people: "It is racially motivated, racially driven and involves racial stereotypes".

The Project Alpha scandal follows a similar data protection scandal involving the Metropolitan police's gangs matrix. In 2018, the Information Commissioner's Office (ICO) found that the gangs matrix was potentially breaking data protection laws by failing to distinguish between victims of crime and offenders. Moreover, the number of young black males on the matrix was found to be disproportionate to their likelihood of criminal or anti-social activity.

In a Guardian study, over 40% of the young people on the matrix from Haringey in North London were found to have been scored as 'zero' risk of causing harm, yet they remained on the matrix. The ICO issued the Met with a formal enforcement notice to improve, and hundreds of names have since been removed from the matrix.

What emerges from these incidents is a pattern of data protection inadequacies by government bodies that has led to the disproportionate profiling of young people, often from ethnic minority backgrounds.

Weaponising our data

Any large-scale profiling of vulnerable data subjects constitutes high-risk data processing. However, the implicit racial bias, combined with the Met police's chequered track record in handling these kinds of personal data, elevates that risk even further.

Whilst the GDPR and other data protection legislation have been designed to protect the rights and freedoms of vulnerable data subjects, Project Alpha provides a perfect example of how personal data can be weaponised by authorities against the very data subjects it is collected from.

Where data controllers (in this case, the Met police and the Home Office) provide insufficient oversight, it falls to us all as data subjects to ensure that our personal data is adequately protected and processed. The work of investigative journalists and human rights campaigners forms a crucial part of the checks and balances on such processing.

About Us: Tacita are GDPR compliance experts who help clients achieve and maintain GDPR compliance. Get in touch to explore our range of GDPR services, including the Tacita GDPR Audit, the GDPR Consultant Service and the GDPR Toolkit.
