Non-fitted devices in the Home Office’s surveillance arsenal: Investigating the technology behind GPS fingerprint scanners
Expanding beyond GPS ankle tags, the Home Office has since Autumn 2022 been issuing so-called non-fitted devices (NFDs) to migrants who are on immigration bail and who are subject to electronic monitoring conditions. We undertook some technical research into NFDs to investigate the intrusiveness of this surveillance technology.
Introduction
With the ongoing expansion of GPS tagging under the UK Home Office's electronic monitoring programme, it has increasingly deployed non-fitted devices (NFDs) that track a person's GPS location and request frequent biometric verification in the form of fingerprint scans.
The NFDs deployed by the UK Home Office are small handheld devices with a fingerprint scanner that record a person's location 24/7 (referred to as their trail data). They alert the person at random intervals throughout the day requesting a fingerprint scan, which the device then compares against a representation of the biometric information stored on it, in order to verify that the person has the device with them at all times for the purposes of tracking. Through previous research we did in relation to the NFDs, we know that these alerts are sent to the wearer up to 5 times per day, although, as set out below, there are reports that individuals are in fact being sent many more alerts than this in a given day. Those subjected to these devices have cited detrimental impacts on their daily life and mental and physical wellbeing, due to the pervasive and erratic nature of these devices.
In May 2024, the outsourcing company Serco took over management of the Electronic Monitoring contract, which had previously been operated by Capita's Electronic Monitoring Service. The technology for the NFDs was supplied by private company, Buddi Limited, until 30 December 2023. The Home Office has since clarified that the technology is now provided by way of the Jelly Star mobile phone, which now includes a direct dial to the monitoring centre. These are manufactured by the Chinese technology company, Unihertz. The Home Office has revealed that this technology will be subject to a phased roll out starting in August 2024 in order to replace the devices provided by Buddi, which is due to be completed by November 2024. The Home Office's Immigration Bail Policy (the Bail Policy) makes clear that the new devices will operate the same way as the ones supplied by Buddi (as above with the sole difference that the Jelly Star mobile phones can be used to call the monitoring centre directly). These identical capabilities appear to include the randomised generation of daily fingerprint alerts and the collection of GPS trail data on a 24/7 basis (this latter point is confirmed by the Equality Impact Assessment conducted in relation to the NFDs, which was last updated on 23 October 2024).
In accordance with the Home Office's Bail Policy, individuals subject to electronic monitoring are usually expected to spend some amount of time on a fitted device (i.e., GPS ankle tag) and then an NFD. They can be moved between the different types of devices through Electronic Monitoring (EM) reviews that must take place every 4 months, in response to representations submitted by the individual subject to the condition, and by request of another Home Office decision-maker.
Determining whether a person will move to a non-fitted device from a fitted device during an EM review will depend on the circumstances of the individual, as well as a range of factors including an opaque 'harm score' generated by the Home Office. The Bail Policy makes clear that those deemed by the Home Office to be 'lower harm' cases will be moved more quickly from a fitted device to a NFD. Similarly, the Bail Policy also states that certain vulnerabilities, including where there is medical evidence that electronic monitoring is having a serious impact on a person's mental health, may require consideration of whether an NFD is more appropriate than a fitted device.
The Home Office's Bail Policy states that officials determining whether an individual should be moved onto an NFD may have access to an automated support tool to provide recommendations. Through Freedom of Information (FOI) requests submitted to the Home Office, it became clear that this tool is called the Electronic Monitoring Review Tool. In an FOI response to a request filed by the Public Law Project (dated 5 April 2023), the Home Office said this tool had been in use since November 2022 and was being used in all EM reviews. This was contradicted by a later FOI response PI received on 9 September 2024, which stated that the tool has not been in use since August 2023. That response is inconsistent with the current version of the Bail Policy (dated March 2024), which still refers to the possibility of caseworkers using an automated streaming tool.
Notably, the Bail Policy makes clear that the duration a person spends on an NFD is expected to be longer than on a fitted device. This appears to be based upon the presumption that an NFD is 'less intrusive' than a fitted device and therefore can allow for monitoring specific cohorts (in particular those deemed to present a lower risk of harm) over greater periods of time than ankle tags.
The Equality Impact Assessment conducted in relation to the NFDs also suggests that these may be more suitable than ankle tags for individuals suffering from particular health conditions - including mental health conditions exacerbated by social stigma associated with fitted devices.
However, as per research, including anonymous interviews with individuals subject to these devices, done by the Public Law Project, Bail for Immigration Detainees, and Medical Justice - NFDs are in fact also highly detrimental to individuals' wellbeing and even introduce new harms unique to the functionality of the device.
In particular, the research points to the fact that the randomness of the alerts as well as the associated constant requirement to provide one's fingerprints contributes to people feeling as though they are "in a constant state of alertness and in a heightened sense of being under constant surveillance". The research shows through the anonymised interviews that this in turn impacts the enjoyment of basic everyday activities - such as being able to sleep properly. Subjects also reported that having too little time (e.g., 1 minute) to provide their fingerprints contributed to the feelings of anxiety and stress they felt.
The possibility of the alerts going off at any time over lengthy periods of time (such as up to 12 hours) also appears from the research to be linked to an increased sense of social stigma that wearers experience as a consequence of being visibly monitored, particularly where they receive an alert in a public place. While the devices described in the research were those supplied by Buddi, there is no reason to consider that these impacts would be any different given that, as above, the Home Office's Bail Policy and the Equality Impact Assessment still refer to the NFDs operating in the same way, including the random daily occurrence of fingerprint alerts.
The technical research we have conducted using a similar device to those deployed by the Home Office showcases how tracking via the NFDs takes place in an unnecessarily intrusive and dehumanising way despite this technology being deemed as having a lesser impact on the wearer. Our research demonstrates how many of the documented negative mental health impacts from being subject to tracking via NFD, which one individual described as making him feel like a 'caged animal', are exacerbated by the design of the system being deployed.
Our research
Among other concerns, individuals have reported several features of NFDs as harmful and detrimental to their wellbeing. Below we show how each of these is exacerbated by the technical design of the devices, and whether other design choices could be made to address them:
- Unreasonable and unforeseeable alert time periods across 12 hours or more (e.g., in the middle of the night or when visiting the shops or a place of worship).
The impact: Social stigma, anxiety about being constantly under surveillance.
The tech: Can the alerts on the devices be set between specific and defined periods of time?
- Unreasonable alert response times (i.e. having too little time to provide one's fingerprints).
The impact: Unfair and unreasonable expectations for fingerprint scanning response times that work against the individual.
The tech: Can the tech customise how much time after an alert a person has to submit their fingerprints?
- Unreasonable or otherwise too-strict match probabilities (e.g., requiring unrealistically high percentage matches, false positives, false negatives).
The impact: Unfair and unreasonable automated biometric matching metrics.
The tech: Can the tech customise a threshold of the biometric match score for a more reasonable pass/fail result?
- High volume of alerts that can go off at any time (including in public).
The impact: This is linked to the first point above, as a large number of alerts may also increase a feeling of social stigma and anxiety at being monitored at all times.
The tech: Can the number of alerts be set to a more reasonable amount (e.g., 3 times per day or even once per day rather than 5 times a day or more) to avoid stigmatising and dehumanising the subject?
Methodology and the device we used
To test the above technical capabilities, we sought to acquire a GPS fingerprint scanner with similar enough functionalities to those deployed by the Home Office - in particular, a device with features that included live GPS tracking and fingerprint matching. These kinds of devices can come in many different forms used in various industries, such as employee monitoring or census tracking. We chose a similar device to that supplied to the Home Office for the reason that the exact model deployed to track migrants may not be publicly available.
We decided on a GPS fingerprint scanner distributed by Chainway, used in both monitoring and employment contexts. The Chainway C66 is a biometric handheld terminal running Android 11 or 13 and equipped with a Qualcomm octa-core CPU, a 5.5" high-definition touch screen, a 13MP autofocus camera and a detachable battery, plus a SIM card slot. On the back of the device is a fingerprint scanner, and the terminal has various functions ranging from near-field communication (NFC) to barcode scanning to fingerprint recognition to Wi-Fi connection; it is also facial recognition compatible. The GPS location function offers three modes: Wi-Fi, network and fused.
Key specifications:
- Dimensions: 160.0 x 76.0 x 17.0 mm / 6.3 x 2.99 x 0.67 in.
- Weight: 297 g / 10.47 oz. (device with battery)
- Display: 5.5-inch high definition full display (18:9), IPS IGZO 1440 x 720
- Touch Panel: Multi-touch panel, gloves and wet hands supported
- Power: Removable main battery 5200mAh, support QC3.0 and RTC
- Standby: up to 490 hours (main battery only; Wi-Fi: up to 470h; 4G: up to 440h)
- Continuous use: over 12 hours (depending on user environment)
- Charging time: 2.5 hours (charge device by standard adaptor and USB cable)
- Notification: Sound, LED indicator, vibrator
- Expansion Slot: 1 slot for Nano SIM card, 1 slot for Nano SIM or TF card
- Interfaces: USB Type-C, USB 3.1, OTG, extended thimble
- Audio: 2 microphones (1 for noise cancellation); 1 speaker; receiver
- Keypad: 1 power key, 2 scan keys, 2 volume keys
- Sensors: Accelerometer sensor, light sensor, proximity sensor, gravity sensor
- GNSS: GPS/AGPS, GLONASS, BeiDou, Galileo, internal antenna
- WLAN: Supports 802.11 a/b/g/n/ac/ax-ready/d/e/h/i/k/r/v, 2.4G/5G dual-band, IPv4, IPv6, 5G PA; Fast roaming: PMKID caching, 802.11r, OKC; Operating channels: 2.4G, 5G (depends on local regulations); Security and encryption: WEP, WPA/WPA2-PSK (TKIP and AES), WAPI-PSK, EAP-TTLS, EAP-TLS, PEAP-MSCHAPv2, PEAP-TLS, PEAP-GTC, etc.
- WWAN (Europe, Asia): 2G: 850/900/1800/1900 MHz; 3G: CDMA EVDO: BC0, WCDMA: 850/900/1900/2100MHz, TD-SCDMA: A/F(B34/B39); 4G: B1/B3/B5/B7/B8/B20/B38/B39/B40/B41
- Bluetooth: Bluetooth 5.1
The fingerprint scanner specs:
- Sensor: TCS1
- Sensing Area (mm): 12.8 × 18.0
- Resolution (dpi): 508 dpi, 8-bit greylevel
- Certifications: FIPS 201, STQC
- Format Extraction: ISO 19794, WSQ, ANSI 378, JPEG2000
- Fake Finger Detection: Support by SDK
- Security: AES, DES key encryption of the host communication channel
The Chainway device is therefore an Android 13 device with a display screen and a fingerprint scanner mounted on the back. By contrast, the Buddi devices appeared to consist solely of the fingerprint scanner (with no display screen) and were no larger than the size of one's palm.
We note that the Jelly Star mobile devices to be rolled out in the latest phase are in fact closer to the ones we tested than the Buddi-supplied technology, according to the publicly available user information about those Jelly Star devices. The Jelly Star devices are also Android 13 mobile phones with a display screen and a fingerprint scanner mounted on the back. They also appear to have similar capabilities, including GPS tracking, a SIM card, identical WLAN support, and facial recognition technology.
The setup
Setting up the device with our test subject's fingerprints was fairly straightforward. The fingerprint scanning app loaded onto the device simply requires the following steps for logging a user and their fingerprint, and everything else is saved and logged in the back-end:
- Enroll: We assigned a PageID (say, 1), and then placed our finger on the back sensor and clicked 'Capture' to store the image to the assigned PageID. Now we've created a subject (identified by PageID: 1) and their fingerprint.
- Identification: To verify this person, in the next 'Identification' tab we inputted our fingerprint's assigned PageID (in our example case, PageID: 1), then we placed our finger on the back sensor and clicked 'Authentication'. This returned the authentication result (a pass or a fail).
- Image: To view the image of the fingerprint capture, we placed our finger on the scanner and then clicked 'Start', after which we are able to see the saved image of the fingerprint captured and stored.
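The enrol/identify flow described above can be sketched as a minimal Python stand-in. All names here (FingerprintStore, enroll, identify) are our own illustrative choices, not Chainway's actual SDK API, and the exact-match comparison stands in for the device's similarity scoring:

```python
# Illustrative sketch of the enrol/identify flow; names are our own
# stand-ins, not Chainway's actual SDK calls.

class FingerprintStore:
    """In-memory stand-in for the device's template storage."""

    def __init__(self):
        self._templates = {}  # page_id -> stored fingerprint template

    def enroll(self, page_id, template):
        # 'Capture': store the scanned template under the assigned PageID
        self._templates[page_id] = template

    def identify(self, page_id, template):
        # 'Authentication': compare a fresh scan against the stored one
        stored = self._templates.get(page_id)
        if stored is None:
            return False  # no subject enrolled under this PageID
        # Real devices compute a similarity score; equality stands in here
        return stored == template


store = FingerprintStore()
store.enroll(1, "finger-template-A")            # create subject PageID: 1
result = store.identify(1, "finger-template-A")  # → True (pass)
mismatch = store.identify(1, "finger-template-B")  # → False (fail)
```

The key point for what follows is that enrolment binds a template to an identifier, and every later alert is just a comparison against that stored template.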
To set up the live GPS location feature, we slotted the SIM card into the back compartment beside the battery pack, and then in the GPS location page of the app we tested all three location detection types - Wi-Fi, fused, network - to locate our live coordinates. The latitude and longitude of the coordinates were unchanged across the three selections, though the altitude varied slightly. The display also showed the 'Time' it took to gather the coordinates and a live 'Status' showing whether the device was still 'locating', and then the location type (e.g., 'fused') once it had been determined. There was also the option of a 'Cold', 'Warm' or 'Hot' start - terms which, for GNSS receivers, typically refer to how much cached satellite data is reused, and therefore how quickly a position fix can be acquired.
The Software Development Kit
Now that we knew how the device worked for requesting fingerprint scans and tracking the subject's live location, the next step was to test how the design of the tracking system could be customised and compared against the features of the NFDs deployed by the Home Office.
Note that our test device came with a customisable software development kit (SDK), which is a set of software development tools that allows developers to build apps for a specific operating system - in this case, Chainway's Android fingerprint scanning hardware. The availability of the SDK allows us to customise the app to both perform identically to the Home Office devices (e.g., requesting fingerprints at random time periods, location tracking) and to test different mechanisms (e.g., limiting requests to provide fingerprints to certain time periods, setting a maximum number of alerts). We note in this regard that the customisations we made to the Chainway SDK (i.e., when to send pings, how many times, etc.) could be similarly applied to the Jelly Star devices through its SDK, or even through Android's built-in SDK (though the Android SDK does not contain anything for match accuracy, only a generic pass/fail).
Chainway's developing environment specs are as follows:
- Operating system: Android 11/13 (we selected Android 13, which is identical to that used by the publicly available Jelly Star devices); GMS, 90-day security updates, Android Enterprise Recommended, Zero-Touch, FOTA, Soti MobiControl, SafeUEM supported
- SDK: Chainway Software Development Kit
- Language: Java
- Tool: Eclipse / Android Studio
The SDK and its accompanying javadoc provided a full suite of functions and calls for the fingerprint scanning technology, including a match() function responsible for matching the fingerprint scans and assigning them pass/fail scores. We demonstrate below how the configuration of the NFDs deployed by the Home Office is highly intrusive in a way that is detrimental to the rights and freedoms of individuals.
The code
We wrote pseudocode utilising the SDK's calls and functions provided in its javadoc. Pseudocode is essentially the logic blueprint of an algorithm written in plain English that a software developer would be able to refer to when building the app in syntax code.
We wrote four different functions to address the concerns about human dignity we highlighted earlier in relation to the current impact of NFDs - ShouldWePing(), main(), SendPing() and overTime(). The code we wrote is generic, but seeks to match what we know about the Home Office's system as far as possible. We make clear where our customised device departs from the features we understand the Home Office's NFDs to have, and where the software could be further customised in a different way to how we have designed it in our experiment.
1. The ShouldWePing() function
Firstly, our ShouldWePing() function allows the developer to set specific periods of time for the fingerprint notifications to alert the device holder to scan their fingerprint. The first 'IF' check establishes a random probability of when to ping (a representation of the Home Office's randomness mechanism) - if the generated probability is deemed 'this is the time', the device then moves onto the next 'IF', which checks if it is within the appropriate time period we've set to generate an alert.
As we've written here in our pseudocode, we demo a time period between the hours of 10am to 6pm (which is further customisable) rather than a wholly unpredictable time period as per the anonymous interviews in the above report. We note that interviewees referred to an alert potentially coming at any time, which made even completing everyday tasks like showering anxiety-inducing and stressful. The window of time we chose would prevent alerts from being sent to the device during the night, which would stop the person subject to the condition from sleeping or mean that they miss an alert thereby putting them at risk of breaching their immigration bail conditions.
We note that the time frame in the above code could be customised to even shorter windows of time - for example sending alerts only between 8am and 9am or 5pm and 6pm (or other similar windows). The timing of alerts can therefore be calibrated in ways that ensure they do not occur in inconvenient and stigmatising moments and between regular and foreseeable windows of time.
We also designed this function so that, if 'the time right now IS between 1000 and 1800', we move onto the next 'IF', which checks whether the device has only just sent an alert (such as less than one minute ago); if so, the device should not yet generate a further alert, avoiding the erratic possibility of a large volume of alerts occurring around the same time. This is significant given that the NFDs generate random alerts, which in some cases may lead to individuals receiving a large number of alerts in one day (up to 10 a day, always at random times, in one case). We address the possibility of coding the software to limit the number of alerts altogether below in the main() function.
We also note that the 'less than one minute ago' parameter in the pseudocode can be customised to any time elapsed, such as one hour.
If we pass all these 'IF' cases in the above function ShouldWePing(), the device should then send an alert to the device ('RETURN true'). Otherwise, no alert shall be sent.
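The ShouldWePing() logic described above can be expressed as runnable Python. The function and constant names, the 10% ping probability, and the exact structure are our own illustrative choices under the stated assumptions, not the Home Office's or any vendor's actual implementation:

```python
# Illustrative sketch of ShouldWePing(); all names and the ping
# probability are our own assumptions, not a real deployed system.
import random
from datetime import datetime, timedelta

ALERT_WINDOW = (10, 18)         # alerts only between 10:00 and 18:00 (customisable)
MIN_GAP = timedelta(minutes=1)  # minimum gap between consecutive alerts (customisable)
PING_PROBABILITY = 0.1          # stand-in for the randomness mechanism


def should_we_ping(now, last_alert, rng=random.random):
    """Return True only if a random draw, the time window,
    and the minimum-gap check all allow an alert right now."""
    if rng() >= PING_PROBABILITY:
        return False  # the random draw says: not this time
    if not (ALERT_WINDOW[0] <= now.hour < ALERT_WINDOW[1]):
        return False  # outside the permitted window (e.g. night-time)
    if last_alert is not None and now - last_alert < MIN_GAP:
        return False  # an alert was sent too recently
    return True       # all checks passed: send the alert
```

Because the window and the minimum gap are plain constants, narrowing alerts to, say, 8am-9am is a one-line change rather than a redesign.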
2. The SendPing() function
This SendPing() function is the function with the most activity and performs several key tasks:
- Allows the customisation of a response time for the person to scan their fingerprint (represented by the variable 'more than allowed'), which can be set to anything such as 5 minutes rather than a particularly short period of time, such as 30 seconds.
- Provides the individual with three tries to scan their fingerprint if their match score is deemed a 'FAIL'.
- Customises the pass/fail threshold for a match score (represented by the variable 'under threshold') so as not to be unreasonably strict in scenarios where a person might be scanning their finger too quickly within a short response window, or perhaps has sweat or grime on their fingers from working. The match() function called by this device returns an 'int' - an integer value - which, per the SDK documentation, ranges from -1 to 127. The integer that match() outputs represents the strength of the fingerprint match, with -1 being the lowest possible match and 127 a near-perfect match. The SDK allows the developer to delineate what integer is to be the cut-off for a pass (represented in our pseudocode by the variable 'under threshold'). For example, we could set the cut-off integer to be 60, which means a match() output of 60 or greater will result in a pass for the fingerprint verification. The purpose of showing this customisable threshold-setting is that a developer has the capability to set a reasonable or a very stringent cut-off, such as an integer of 126 or greater (out of 127) as opposed to 60 (out of 127) as a passing score. It is unclear whether the NFDs used by the Home Office rely on a match score and, if so, how stringent it is.
- Sends the GPS location of the subject only at the moment they scan their fingerprint ('GET location' in the pseudocode), not constantly in the background 24/7. This is particularly significant given that we know that the NFDs collect and transmit trail data on a 24/7 basis in the same way that GPS ankle tags do. As we demonstrated in our technical research relating to ankle tags, the frequency with which devices collect and transmit trail data can be limited to certain times in ways that the Home Office does not appear to have considered. The possibility of GPS location being sent to the Home Office only at the same time as a fingerprint alert (particularly given that these can be customised to certain times - see below) means that the system could be calibrated to monitor the subject at very specific and pre-determined times. For example, trail data could be collected once a month during a scheduled 'check-in' at the same time as the individual subject to the condition provides their fingerprint.
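The SendPing() behaviour described in these bullets can be sketched as follows. The constants, function names, and the representation of scan attempts as (score, elapsed-seconds) pairs are our own illustrative assumptions; only the -1 to 127 score range and the idea of a configurable threshold come from the SDK documentation discussed above:

```python
# Illustrative sketch of SendPing(); names and structure are our own
# assumptions, not the Home Office's or Chainway's actual code.
RESPONSE_TIME_SECONDS = 300  # 5 minutes to respond, rather than e.g. 30 seconds
MAX_TRIES = 3                # three attempts before a definitive fail
PASS_THRESHOLD = 60          # match() output of 60+ (out of 127) passes


def send_ping(scan_attempts, get_location):
    """scan_attempts: successive (match_score, seconds_elapsed) pairs.
    get_location: called only at the moment of a successful scan,
    never continuously in the background."""
    for tries, (score, elapsed) in enumerate(scan_attempts, start=1):
        if elapsed > RESPONSE_TIME_SECONDS:
            return ("MISSED", None)   # ran out of time to respond
        if score >= PASS_THRESHOLD:
            # location captured only now, at verification time
            return ("PASS", get_location())
        if tries >= MAX_TRIES:
            return ("FAIL", None)     # three attempts exhausted
    return ("FAIL", None)
```

Raising PASS_THRESHOLD towards 127 makes the check far stricter; lowering RESPONSE_TIME_SECONDS reproduces the "less than one minute" pressure interviewees described. Both are single-constant choices made by whoever configures the system.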
3. The main() function
This main() function simply calls all the above functions in the overall running of the app to pull all these features together. This function also includes a customisable volume of alerts per day, represented in our demo by 'maximum of 3 times', which the developer can set to any number of notifications over a given period of time. This can be calibrated to ensure a more reasonable volume of alerts than 5 or even 10 fingerprint alerts per day (as one interviewee reported). While our code allows for a maximum of 3 times per day, this could also be set to a maximum of once per day or even once per week or month.
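The daily cap described here reduces to a simple counter gate around each candidate alert, which a sketch (our own naming, under the assumptions above) makes concrete:

```python
# Illustrative sketch of the daily alert cap in main(); names are our own.
MAX_ALERTS_PER_DAY = 3  # customisable: could equally be once per day or per week


def run_day(candidate_pings):
    """candidate_pings: the booleans ShouldWePing() would produce over one day.
    Sends at most MAX_ALERTS_PER_DAY alerts, regardless of how many
    candidate moments the randomness mechanism generates."""
    sent = 0
    for wants_ping in candidate_pings:
        if wants_ping and sent < MAX_ALERTS_PER_DAY:
            sent += 1  # SendPing() would be invoked here
    return sent
```

However erratic the random draw, the cap bounds the subject's burden: ten candidate pings still produce only three alerts.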
The app
Pulling together all the above functions of the pseudocode, we developed a basic visual mockup of what the app that performs all these functions should look like in its implementation:
Note that in our design of the app, as written in the above pseudocode, we've encoded transparency into the design of the display screen to encourage disclosure to individuals of:
- The current time, so that they know when they are being alerted.
- A countdown timer for how long they have left to scan their fingerprint.
- Their GPS location detected at the time of scanning when they submit their fingerprint scan.
- Their match score so as to better understand what match levels they are getting and that are being sent to the Home Office.
- How many tries they have left to scan.
Inhumane with intention
What our pseudocode shows is that the existing technology for GPS fingerprint scanners has the capability to implement more humane practices that better respect individuals' human dignity - significant given that the tracking is already highly intrusive even absent the specific design of the tracking system. Instead, the Home Office characteristically deploys technology without adequate consideration of the human rights impact on those subjected to monitoring.
Transparency by design
Our pseudocode contains both technical and design changes to the devices' functionality that underline that the detrimental impacts device holders have reported, such as social stigma and paranoia, are encoded into the very design itself. As above, these adverse impacts are likely to be exacerbated by the total lack of transparency around the design and functionality of the tracking, which leaves subjects feeling that they are at the whims of randomised surveillance technology. 'Transparency by design' practices, such as displaying the person's passed or failed match score or how much time they have left to scan their fingerprint, allow device holders to be more informed about the outcomes of their scanning, which are largely governed by the algorithm. This appears to be the opposite of the approach taken by the Home Office's current devices.
Respecting human dignity
Our pseudocode exhibits how the technological design of the tracking system is particularly intrusive and stigmatising, despite the demonstrated possibility of limiting alerts to specific windows of time, as well as capping the number of alerts altogether (for example to 3 times a day or once a week).
It is clear from interviews with individuals subject to NFDs that an unreasonably large and unpredictable volume of alerts coupled with the lack of transparency around how many notifications should be expected at particular times of the day fosters 'constant alertness' that severely limits how device holders can operate and conduct their everyday lives.
The technology is also capable of establishing more reasonable timeframes for these alerts, such as between 10am and 6pm (or even between 8am and 9am, for example), rather than unreasonably long windows of time, such as a 12-hour period, or times in the night when people are typically sleeping. These windows of time could even be customised on a case-by-case basis, for example following a request by an individual's legal representatives.
In the above-mentioned report, some interviewed individuals even mentioned having less than one minute to scan their fingerprints, which creates a cycle of systematic failures that works against the individual. For one, a person who is playing sports or working might need more than one minute to wipe their fingers or step aside to scan their fingerprints; these individuals may simply need another few minutes to collect themselves. Our pseudocode shows quite simply how the technology can provide a more reasonable response time (e.g., 5 minutes) and display a timer so that individuals are informed of how much time they have.
Our research shows that the design of the technology also exacerbates the social stigma and shame stemming from the imposition of GPS tracking generally. As highlighted in the interviews included in the above-mentioned report, a loud alert in public, or taking the device out on the bus to quickly scan a fingerprint, can be detrimental to a person's emotional and mental wellbeing. This is at odds with the suggestion implicit in the EIA conducted in relation to the deployment of NFDs that their purpose is to alleviate the stigma stemming from fitted devices.
Limitations on live GPS tracking
The Home Office's devices currently track all the movements of the device holder at all times and notify them to scan their fingerprints as a way to verify they have the device on hand. As noted above, interviewees have reported feeling 'like a caged animal', as the alerts constantly remind them they are being watched everywhere they go. This exceeds the necessary means of monitoring 'low harm' individuals moved to NFDs. Alerts going off as many as 10 times a day, combined with constant live location tracking, exacerbate feelings of constant surveillance that discourage and dehumanise individuals.
Our pseudocode implementation shows that 24/7 GPS tracking is neither necessary nor required for the purposes of monitoring. Instead of constantly tracking a person's GPS location in the background, which encroaches on the individual's right to privacy, the device might only log the person's location data when they scan their fingerprint. This is much more reasonable and just as effective for the Home Office's purposes of monitoring, particularly with reference to current information about the rate of absconding by individuals on immigration bail. According to FOI requests submitted by Bail for Immigration Detainees, the rate of absconding for people released from immigration detention was 2.7% in 2021 and 1.3% in the first six months of 2022. There has been no effort by the Home Office to take into account the particular circumstances of a given case by, for example, reducing the intervals at which an individual is subject to GPS and fingerprint tracking, particularly where they have a strong history of compliance with immigration bail conditions.
Conclusion
Non-fitted devices were supposedly introduced by the Home Office as a 'less intrusive' alternative to ankle tags, but as reported by those subject to them, in reality the devices are no less stigmatising or detrimental than fitted tags, and in fact introduce different harms unique to the technology. Our technical research showcases that the NFDs deployed by the Home Office are configured in a particularly intrusive way. This is exactly in line with the findings of our previous technical research on ankle tags, which demonstrated that the fitted devices deployed by the Home Office are set to collect location data at the most intensive rate possible. This is a concerning finding, as we reiterate that the Home Office has an ongoing duty to ensure that the technology it deploys complies with data protection and human rights law. But as our research above has shown, the onerous design of the NFDs currently deployed by the Home Office does not readily meet these requirements.
From what we have seen reported and what we know is technically possible, the Home Office has either not considered or disregarded the possibility of configuring fingerprint and location tracking at specified times rather than on a 24/7 basis. The same is true of setting fingerprint alerts within more reasonable and foreseeable time windows and giving individuals being tracked more time to provide their biometric information. More widely, this is indicative of an immigration system built on data exploitation and surveillance.
The human cost is borne by individuals who are made to feel like 'caged animals' - constantly thinking about when the next alert might go off - when they are on a bus, seeing friends, or trying to sleep.
We must see the Home Office make more humane decisions as to how they use new technologies to ensure that they do not further dehumanise migrants but instead take a deliberate approach to protect and respect their right to live in dignity and free from stigma and arbitrary surveillance as they navigate an increasingly hostile environment.