Surveillance and social control: how technology reinforces structural inequality in Latin America
Picture: Kirill Sharkovski (CC BY-SA)
This article was written by Jamila Venturini from Derechos Digitales. The original version (in Spanish) is available here.
How the implementation of social protection programs that condition access to basic services on state and private surveillance exacerbates the prevailing inequality on the continent.
While the gap between rich and poor is widening worldwide, Latin America remains the most unequal region in the world. According to the Economic Commission for Latin America and the Caribbean (ECLAC), there are marked regional imbalances between different socioeconomic levels in areas such as life expectancy, infant mortality, illiteracy and access to water inside homes. The high inequality plaguing the continent thus directly influences the well-being of its inhabitants, their potential for development and the exercise of their fundamental rights.
The implementation of programs that condition access to basic services on state and private surveillance clearly exemplifies not only that technologies are not neutral, but also how they impact different groups differently according to gender, skin color and social class.
Taking advantage of the shortcomings of our legal systems and exploiting their grey areas, the tech industry, working with governments, has aggressively propelled a form of “technosolutionism” that is irresponsibly embraced by a political class driven by a diminished idea of progress. What is at stake now is the potential for limited improvement, achieved at the expense of the rights of those who have no choice but to undergo constant scrutiny, monitoring, control and discrimination.
Today, inequality hides behind a series of empty phrases - big data, algorithmic decisions, artificial intelligence - which, in the name of efficiency, attempt to normalize the biases under which they operate, rendering opaque to public scrutiny both the systems that process the data and the way they are used to regulate access to social programs, public transportation or attendance at popular events.
There are plenty of examples. In Argentina, the province of Salta signed an agreement with Microsoft in 2017 to use artificial intelligence to prevent teenage pregnancy and school dropout. According to the company, based on data collected among populations in vulnerable situations, "intelligent algorithms identify characteristics in people that can lead to some of these problems [teenage pregnancy and school dropout] and warn the government so that they can work on prevention." The data collected is processed on Microsoft servers distributed around the world, and the processing specifically targets adolescents identified as being at risk, affecting not only their privacy but also their autonomy and creating a wide potential for discrimination. It is, ultimately, a mechanism of control over individuals in vulnerable situations, who are exposed to interventions without their consent and deprived even of the possibility of deciding on them, which only reinforces their vulnerability.
Although it could be argued that the data used for these predictions is submitted voluntarily, it is questionable whether the girls and adolescents affected by these measures - or their guardians - can give explicit consent, considering the implications of providing specific information about their sexual habits and potential pregnancy. It should be noted that Salta was the last Argentine province to stop providing religious education in public schools, after a ruling by the Supreme Court recognized violations of the rights to equality and non-discrimination, as well as of citizens' privacy. The reliance on technology described above is therefore nothing more than an expression of broader failures to understand people's spheres of autonomy and privacy, put to political use.
In Brazil, the Ministry of Citizenship signed an agreement with the government of Salta and Microsoft to implement a similar program. In this case, in addition to preventing teenage pregnancy and school dropout, the program is intended to anticipate issues such as malnutrition and disease in early childhood. Brazil would be the fifth country in the region to repeat the Argentine experience. Beyond the questions about informed consent and state access to sensitive information on populations in vulnerable situations, other questions remain unanswered, such as what other uses or predictions can be extracted from these data and what risks they pose, considering their processing by Microsoft and the governments involved in the program.
In 2019, Chile began its own pilot implementation of a tool that seeks to detect children and adolescents at risk. According to the Ministry of Social Development and Family, Alerta Niñez is a preventive instrument that "identifies the set of individual, family, environment and peer conditions of children and adolescents, which tend to occur when there is a risk of violation of their rights." Through the statistical processing of large amounts of data from public bodies, the system assigns each child and adolescent a score based on their estimated probability of suffering a rights violation.
Although in this case the system was developed by a local private university, it is once again an invasive initiative to collect sensitive data on minors, one that carries a great risk of deepening prejudice and stigmatization towards historically vulnerable groups. In addition, these processes involve the transfer of personal data to third parties and the possibility that such data will be used for purposes other than those agreed upon, without legal bases or guarantees that the information generated will not later feed other initiatives, such as predictive policing.
Given that teenage pregnancy, school dropout and malnutrition are structural problems in the region, it is highly questionable that the associated policies should be mediated by, or conditioned on, the collection of large amounts of data. The apparent lack of concern for the rights of children and adolescents, as recognized in human rights instruments in force throughout the region, points to a deeper problem.
Surveillance, control and exclusion
In Chile, the implementation of biometric identification systems in the national health system is of concern because of the barriers it could create for marginalized and impoverished populations in accessing basic health services. It would also affect senior citizens, whose fingerprints can become harder to read.
The implementation of the so-called "Biometric System for Food Safety" in Venezuela requires citizens to verify their identity through their fingerprints to buy food, hygiene products and medicine. It has led to complaints of discrimination against foreigners - documented and undocumented - and transgender people. The situation is particularly worrying given the scarcity of essential goods and the worsening humanitarian crisis in the country, which mainly affects the rights to food and health of the populations in the most vulnerable situations.
In São Paulo, facial recognition cameras were introduced two years ago in the public transportation system, with the justification that they would help prevent fraud in the use of social benefits associated with transportation, such as discounts for seniors, students and people with disabilities. Since then, the system has blocked over 300,000 cards allegedly used improperly, that is, not by their holders. At the same time, the municipal government has announced the total suspension of anonymous cards and has implemented measures requiring their registration with unique identification and residential data. Measures of this kind can restrict access to the service for unregistered persons, such as homeless people and immigrants. In a city the size of São Paulo, cards that allow discounted travel on different types of transportation are critical for getting most of the population to work, school and cultural activities. Blocking or hampering access to transportation can have a major impact on people's lives and development.
In addition to limiting access to public services for historically marginalized segments of the population, compulsory, biometric identification systems mean that these groups are over-surveilled. There is limited information on how the data collected is used, aggregated and shared, and demanding sensitive information for the delivery of basic services or social benefits hardly seems proportionate. In the Venezuelan case, the biometric databases come from the electoral system and are used both by state operators - including immigration officials and police officers - and by supermarket and pharmacy cashiers, without any prior legal requirement. In São Paulo, the municipal government went so far as to announce the sale of the travel card databases but, under public pressure and after the approval of a data protection law in Brazil, it reversed its position.
It is worth remembering that only the users of public health, social assistance and public transportation services are subjected to these systems. Local elites who can turn to private providers thus manage to keep greater control over their information and preserve their privacy.
Inequality, discrimination and poverty
The fact that surveillance mechanisms disproportionately target the most vulnerable groups is not new; it goes back to the precarious social control processes that underpin many of our societies. Even today, when technologies offer the possibility of optimizing the delivery of services of all kinds, we see them being used to maintain an unequal social structure in which the exercise of rights is restricted to a small elite. With the help of technology, surveillance punishes vulnerability.
It does not have to be this way. The promise of technology is to improve our lives. That promise should extend across the whole of society, not be reserved for those who can afford the improvements or who can pay to avoid abusive uses of technology.
A fundamental rights approach, with an intersectional understanding of the different types of exclusion that technologies can promote or help end, is the only way to tackle the inequality that millions of people face on the continent. Only then can new technologies become a factor in closing the gaps we face today.