Producing real change: key highlights of our 2024 results

Since the beginning of 2024, we’ve achieved some wins.

In 2024, Privacy International (PI) continued to produce real change by challenging governments and corporations that use data and technology to exploit us.

Since the beginning of the year, we’ve achieved some wins and would like to share the most recent ones with you.

Creating change is hard, and it takes time. We have to uncover problems, draw attention to them, and press for change. In the latest quarter, we've been able to compel disclosure about the algorithms used in immigration cases and to find out more about where the police get data on us all. Sharing what we find with you, our partners, and institutions is a key step towards stopping data exploitation, gaining stronger safeguards, and holding companies and governments to account.

Take a look below for a quick overview of the results we produced or contributed towards, by season.

Autumn 2024

Revealing the UK’s use of algorithms in Immigration and Law Enforcement

Part of our approach to change is to uncover and expose violations and data-abusive practices by governments and companies.

This autumn, we obtained documents from the UK Government’s Home Office confirming its use of the “Identify and Prioritise Immigration Cases” (IPIC) algorithm. The algorithm is used to make recommendations about conditions for individuals on immigration bail, deportations (referred to as ‘returns’), and EU Settlement Scheme cases. After analysing these disclosures, we are calling on the Home Office to provide more transparency regarding the existence and functions of its algorithmic decision-making systems, and their impact on those affected by these decisions.

Additionally, as a result of FOIA requests to UK police forces and the Ministry of Defence, we accessed new information about how the UK police use data held by data brokers like Experian and Equifax. We are currently processing this information and will publish our analysis shortly.

What this means in short: We have proven that the Home Office uses the IPIC algorithm for immigration decisions about people, and that the police access people’s data from data brokers like Experian and Equifax. We call for transparency and increased scrutiny regarding how these systems operate and how they affect people’s rights.

UN reports reflect our positions on the use of technology in Education and Protests

United Nations reports serve as important reference points for policymakers and government officials to create fair and transparent policies.

This autumn, several high-level UN reports directly referenced and reflected our positions. The report on Human Rights in the Administration of Justice, which highlights how human rights are upheld within the justice system, pointed to the unlawful use of surveillance against protestors, as raised in our submission. A key report on artificial intelligence in education not only referenced our submission but also adopted our recommendation that companies providing AI systems to educational institutions “make their technologies fully auditable by any third party”.

We are pleased that our positions have been taken into account by the UN and will help to establish better global human rights standards regarding the use of technology in various areas.

What this means in short: The reflection of our positions on the use of technology in education and protests in UN reports helps promote higher human rights standards, which national governments should take into account.

PI’s educational materials are helpful to other civil society organisations, data protection authorities, and academics

Over the last three months, a number of other civil society organisations, academics, and data protection authorities expressed interest in, or directly used, our materials for educational or communications purposes. The materials used included our piece on generative AI and surveillance capitalism, our tech explainer on election technologies in Kenya, our Digital Health explainer, our piece on vertical tech integrations, and our report on the West African Police Information System (WAPIS).

What this means in short: Our research and educational materials are not only sources of information but also valued resources that others use to train their teams, better shape their demands, and prioritise their work.

Winter & Spring 2024

New EU regulation empowers consumers

On 17 January 2024, the European Parliament adopted the Directive on empowering consumers for the green transition. As a result of our advocacy, the Directive reflects our demands, including that consumers will have information about the minimum period during which devices should receive security updates, accessible through a harmonised label.

What this means in short: Phones will be kept updated and secure for longer, meaning less tech gets thrown away, contributing to a more sustainable future.

Amazon and iRobot forced to terminate their merger

On 29 January 2024, Amazon terminated its plans to acquire iRobot. That decision came after the European Commission initiated a Phase II in-depth investigation and published its Statement of Objections pointing to potential harms of the merger to competitors and consumers. We contributed to this result by making submissions to the UK competition regulator and the European Commission. PI also obtained ‘third person interest’ status in the European Commission’s review of the merger.

What this means in short: Big Tech companies are acquiring smaller companies to obtain market dominance – and also collect more data about you. This win shows Big Tech’s power can be constrained, and together we can stop this practice.

Russia’s law on access to encrypted communication violates human rights

On 13 February 2024, the European Court of Human Rights issued its judgment in the case of Podchasov v Russia. The Court ruled that “legislation providing for the retention of all Internet communications of all users, the security services’ direct access to the data stored without adequate safeguards against abuse and the requirement to decrypt encrypted communications, as applied to end-to-end encrypted communications, cannot be regarded as necessary in a democratic society”. PI intervened in the case by providing technical and legal analysis of encryption and its key role in the protection of human rights. The Court’s reasoning relies on and cites PI’s submissions. (Listen to our podcast about this fascinating case.)

What this means in short: Governments are always trying to undermine our abilities to use strong and secure encryption – and Europe’s court pushed back, setting an example for other courts across the world.

The Inter-American Court of Human Rights holds Colombia accountable for violating the right to defend human rights

In March 2024, the Inter-American Court of Human Rights issued a historic judgment declaring that the Republic of Colombia is responsible for human rights violations against several members of the Colectivo de Abogados y Abogadas José Alvear Restrepo (CAJAR) and their relatives.

This decision marks the first acknowledgment within the inter-American context of a state’s international responsibility for violating the right of people to defend human rights. This violation, according to the Court, was committed through secretive and unlawful intelligence activities, among other methods. The Court’s judgment reflected the positions presented in our intervention in the case, which we filed alongside other organisations, represented by the International Human Rights Law Clinic at the University of California, Berkeley.

What this means in short: This case challenged state surveillance of human rights defenders in Colombia, and won, helping them to do their vital work without fear of being unlawfully surveilled.

Regulator and Courts condemn UK’s GPS tagging of migrants

In the span of three months, following our interventions and complaints, two UK courts and one regulatory authority handed down rulings on the UK’s GPS tagging of migrants, dealing serious blows to the legality of the policy:

  • On 1 March 2024, the UK privacy regulator, the ICO, issued its decision on our complaint against the UK Home Office’s GPS tagging of migrants. The regulator found that the Home Office’s pilot of GPS electronic monitoring of migrants breached UK data protection law. The ICO also issued an enforcement notice and a warning to the Home Office for failing to sufficiently assess the privacy risks posed by the electronic monitoring of people arriving in the UK via unauthorised means.
  • On 12 March 2024, the High Court of England and Wales handed down the first court judgment on the GPS tagging of migrants, in which PI filed witness evidence.
    The court found that the Home Office had been unlawfully tracking the Claimant, Mark Nelson, with a GPS ankle tag for over a year. Having to wear a broken device was found to be a disproportionate interference with his right to private and family life.
  • On 15 May 2024, a London Administrative Court handed down its judgment in the case of ADL & Ors v Secretary of State for the Home Department. The case was brought by four people without British citizenship who had GPS tagging conditions imposed on them upon release from immigration detention at various points in 2022. The court found GPS tagging of migrants was unlawful in several respects and breached their right to private life.

Find out more about our work systemically challenging the GPS tagging of migrants in our analysis.

What this means in short: The regulator and the courts agree with us that it is wrong to subject migrants to such punitive and cruel technology.

UN Human Rights Committee raised concerns about a series of privacy-undermining initiatives launched by the UK government

As a party to the International Covenant on Civil and Political Rights (ICCPR), the United Kingdom regularly reports on its implementation. Ahead of the Human Rights Committee’s consideration of the United Kingdom’s eighth periodic report on the implementation of the ICCPR, PI submitted to the Committee a series of key concerns relating to: (1) GPS tracking of migrants; (2) changes to UK surveillance law (the IPA 2016 amendments); (3) the then-pending Data Protection and Digital Information Bill; (4) Technical Capability Notices and the weakening of encryption; and (5) facial recognition technology. As a result, the Committee raised the issues we included in our submission (with the exception of those around Technical Capability Notices), including concerns about the Data Protection and Digital Information Bill’s provisions on access to the bank details of benefits claimants. The reflection of these concerns puts additional pressure on the UK government to address these issues.

What this means in short: UN mechanisms share our concerns with regard to privacy issues arising from the UK Government’s policies, and urge change.

Supporting privacy-related legislative developments in South Africa

In a joint statement with other civil society organisations, we urged the South African Parliament to reject the draft General Intelligence Laws Amendment Bill 2023 (GILAB). We also submitted our comments to the Parliament. On 26 March 2024, the National Assembly adopted the third version of GILAB, incorporating many of our suggestions. A major win, aligned with our and South African civil society’s requests, was the removal of the provision allowing security vetting of non-profit organisations, churches, and their personnel. Additionally, this version of GILAB improves the regulation of mass interception, with stricter controls on data management and protections, recognising the safeguards under the Protection of Personal Information Act. Yet the Bill, now before the National Council of Provinces for consideration, still has shortcomings that need to be addressed.

What this means in short: People in South Africa will benefit from more human rights-compliant legislation regulating state intelligence.

EU Cyber Resilience Act includes better standards for digital products regulation

On 12 March 2024, the European Parliament approved the Cyber Resilience Act, which defines cybersecurity requirements and standards for “products with digital elements” (“PDEs”) placed on the EU market. Privacy International reviewed earlier versions of the regulation and proposed a series of potential adjustments.

The final text of the Regulation obliges manufacturers to ensure the “effective handling of vulnerabilities” of PDEs for “no less than five years”, as well as to notify authorities about identified vulnerabilities and serious cybersecurity incidents. These provisions were among the advocacy points we promoted through our submissions.

What this means in short: Manufacturers and retailers are responsible for maintaining the security of their products (for at least 5 years), notifying authorities about identified vulnerabilities, and disclosing serious cybersecurity incidents. This means more protections for users and, potentially, longer use of devices.

The European Court of Human Rights condemns mass secret surveillance in Poland

Four years ago, PI, together with Article 19 and EFF, intervened in the Pietrzak and others v Poland case to call out the Polish government’s unrestricted surveillance of communications data.

On 28 May 2024, the European Court of Human Rights in its judgment found that Poland’s mass secret surveillance powers indeed violated the right to privacy under Article 8 of the European Convention on Human Rights. It condemned Poland’s operational control regime, retention and use of communications data, and secret surveillance regime under the Anti-Terrorism Act. The Court concluded, among other things, that the requirement on information and communications providers to retain users’ communications data for potential future access by the authorities interferes with the right to privacy. It also noted that the relevant national legislation was not sufficient to ensure proper protection of the right to privacy and thus was not “necessary in a democratic society”.

This judgment articulates the need to reform Polish surveillance regulations. It also sets a precedent for other European countries operating in a similar fashion.

What this means in short: We can limit governments’ vast surveillance powers. Poland’s mass surveillance of communications data and other aspects of its secret surveillance regime need to change. Other European countries with similar mass surveillance systems should follow suit if they don’t want their powers to be challenged before the European Court.

Summer & Autumn 2024

New standards for data protection in elections

In June 2024, the Council of Europe Convention for the protection of individuals with regard to the processing of personal data (Convention 108) adopted Guidelines on the protection of individuals with regard to the processing of personal data for the purposes of voter registration and authentication. These address questions about the data collected, processed, and managed by official electoral management bodies, and provide practical advice on how systems of voter registration and authentication should comply with existing standards. The final version of the document adopts or reflects PI’s positions and recommendations. The Guidelines will serve as a reference point for the organisation and management of election processes in countries across the world.

What this means in short: Elections around the world should include clear mechanisms to protect people and their data during voter registration and authentication. This means fairer and less abusive elections.

Please consider supporting us as we continue our work to protect privacy and human rights around the world.