The Hostile Office

The Digital Hostile Environment


The digitisation of the ‘Hostile Environment’ in the UK is growing. Migrants are being used as a testing ground for the State’s digital surveillance plans. Along with our friends at the Open Rights Group and Big Brother Watch, we are committed to challenging the expansion of the digital border and the attack on migrants’ data rights.

In recent years, the Government has increased technological surveillance. The Home Office has said it plans to be operating a “fully digital system” for all migrants by 2024.

Data sharing

A key component of the digital Hostile Environment is a network of data sharing agreements between different public agencies that allow the Home Office to access people’s personal information for immigration enforcement purposes. 

In April this year, the Immigration Minister, Robert Jenrick MP, announced plans to restart data sharing with the financial sector. This is part of the Government’s effort to stop undocumented migrants accessing banking services. The policy of data sharing with banks was previously introduced in 2016 under the Immigration Act, but was paused in 2018 due to the Windrush scandal. 

The Government has begun storing the details of all migrants in a centralised database, which has already been made available to the Department for Work and Pensions, Revenue and Customs, NHS trusts, and businesses and private individuals who are employers or landlords. Banks and building societies will then check personal current account holders against details of disqualified people shared by the Home Office. 

There has been little transparency around who is entitled to access the data or how decisions to expand access in the future will be made. Access to these databases by outside individuals presents a real risk of companies and private individuals using it to profile, manipulate and discriminate against migrants. 

The Home Office has proven untrustworthy with personal data before, and mistakes in people’s records can cause significant delays or the wrongful denial of access to vital services for migrants. An independent investigation in 2016 found an error rate of 10% in a sample of accounts, with people incorrectly listed as ‘disqualified persons’ (ICIBI, 2016).

Technology is reinforcing inequalities

The increased use of technology in immigration enforcement has, and will have, a disproportionate impact on People of Colour and people from migrant backgrounds. Biometric tools, including facial recognition software, are becoming increasingly common in the UK’s immigration system. The Home Office is currently developing a ‘Biometric Matcher’ platform, which will allow police and immigration officers to conduct biometric searches across fingerprints and facial scans held on a vast joint database. This platform could enable the monitoring and tracking of migrant communities on an enormous scale. We are increasingly concerned about the impact facial recognition software is having on migrant and racialised communities.

Research shows that AI technologies have serious vulnerabilities, are systemically biased, and are applied in ways that exacerbate racial inequality. These programs are likely to have a disproportionate impact on marginalised communities, who already face increased risk of immigration stops and right to work checks. Facial recognition software is often error-prone and, in some cases, has failed to recognise people with darker skin tones.

Accuracy is a huge problem: since the Met and South Wales Police implemented live facial recognition, 89% of matches across their deployments have been wrong. A report compiled by Big Brother Watch found serious racial bias in how South Wales Police uses operator-initiated facial recognition (OIFR), a facial recognition search run through a police officer’s mobile device: people from a non-White background are almost four times more likely to be subjected to an OIFR scan than White people. People of Colour are also more likely to be subjected to fingerprint scanning and stop and search.
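Figures like the 89% above reflect a well-known base-rate effect: when a matcher scans very large crowds in which almost nobody is on a watchlist, even a small false-positive rate means most alerts are wrong. The sketch below uses entirely hypothetical numbers (crowd size, watchlist presence, and error rates are all assumptions, not police data) to show the arithmetic:

```python
# Hypothetical illustration of the base-rate effect behind live facial
# recognition accuracy figures. Every number below is an assumption for
# illustration only, not Met or South Wales Police data.

crowd_size = 50_000          # faces scanned in one deployment (assumed)
watchlist_present = 5        # watchlisted people actually in the crowd (assumed)
true_positive_rate = 0.80    # chance a watchlisted person is flagged (assumed)
false_positive_rate = 0.001  # chance anyone else is wrongly flagged (assumed)

# Expected correct and incorrect alerts from one deployment
true_matches = watchlist_present * true_positive_rate
false_matches = (crowd_size - watchlist_present) * false_positive_rate

# Share of all alerts that are wrong (the "false discovery rate")
share_wrong = false_matches / (true_matches + false_matches)
print(f"Wrong matches: {share_wrong:.0%} of all alerts")
```

Under these assumed numbers, roughly nine in ten alerts are false, despite the system catching most watchlisted faces and wrongly flagging only one person in a thousand.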

The increasing use of biometric surveillance risks automating the inequality and discrimination already baked into immigration services and policing. Decisions made by technology are often opaque and difficult to challenge, creating additional barriers to justice for those in the immigration system.

Right to work checks have gone digital, causing a wide range of issues for migrant workers. The checks are error-prone, and there is little transparency about how migrants’ personal data is collected and shared. Incorrect data records and technical failures risk preventing people from taking up employment, or leading to unfair dismissal, resulting in a lack of income and sometimes destitution. You can learn more about how we’re challenging right to work checks through our Challenge the Checks campaign.

Digitisation of the border

In July 2022, the then-Home Secretary, Priti Patel MP, announced plans to begin the rollout of “secure contactless border crossings” as part of the New Plan for Immigration. The pilot will begin testing in 2024. Passengers will undergo pre-screening and be identified at the border using the “latest technology”.

By the end of 2024, people who do not need a visa to enter the UK will need to have an Electronic Travel Authorisation (ETA) before they visit. ‘Non-visa nationals’ include EU and US citizens but exclude British and Irish citizens. The new system will require visitors to apply and pay for the ETA prior to travel. It will be a digital record linked to a person’s passport and confirmed by email.

The UK’s Immigration Rules will require people to apply online or through a mobile app. The form will ask for a photograph, biographical and contact information, passport details, and information on criminal offences and immigration history. The Home Office eventually wants people to provide fingerprints, and the department has been running feasibility trials of fingerprint self-upload technology, known as biometric self-enrolment.



Open Rights Group briefing: How the DPDI Bill harms migrants’ data rights

Big Brother Watch Biometric Britain: The expansion of facial recognition surveillance
