Where do refugees’ personal data end up?

The protection of refugees’ personal data in the migration context is increasingly relevant, especially given the refugee crisis, the war in Ukraine and the emergence of new technologies such as AI.

We are increasingly finding that some European countries assess refugee and migrant applications by looking at their social media and digital footprint.  

Moreover, several border authorities are asking asylum seekers to hand over passwords for social media accounts and cell phone data.  

 

AI systems are also increasingly used in certain European countries. Such systems can be used for emotion recognition, for problem analysis in shelter camps and for labeling certain individuals as a “risk.” Biometric data collection also continues.

Incidents involving data theft, data loss, disclosure and unauthorized or inappropriate use of personal data in the migration context are therefore more common than they should be.

While data sharing is useful for cross-country comparison and evidence-based policymaking, and despite growing demand for more frequent and timely migration data, the right to privacy is a human right and data protection must remain a priority.

In this blog, we explore the extent to which the GDPR is respected when it comes to asylum and migration.

What data is collected from asylum seekers?

Asylum seekers are expected to provide a large amount of personal information. Personal data is requested at various stages of the asylum procedure. The amount varies from country to country.

Upon their arrival in most European countries, asylum seekers will be screened, photographed and have their fingerprints taken.

Asylum seekers go through the reception and identification process, an eligibility assessment, full registration and an asylum interview to qualify for refugee status.

When asylum seekers travel to other countries, their personal information and biometric data are often shared with local border authorities to validate their identity. The personal information usually includes their name, age, gender and nationality; the biometric data is usually obtained via facial images, fingerprint scans or iris scans. UN agencies involved in the asylum process collect, share and store this personal and biometric data for refugee registration and aid distribution.

Legislative developments and application in practice

Moreover, various border authorities ask refugees for the passwords to their social media accounts and for cell phone data, depending on national laws. Currently, Denmark and Germany have laws allowing the extraction of information from electronic devices. For a long time there was also talk of introducing this in Belgium.

A law had long been on the table that would give the CGRS (Commissioner General for Refugees and Stateless Persons) the right to request access to asylum seekers’ cell phones, laptops and social media profiles. That amendment to the Aliens Act came about in November 2017 on the initiative of then-Secretary of State for Asylum and Migration Theo Francken (N-VA).

The law has never been applied in practice, because the Royal Decree (RD) that the CGRS needs to define the precise details of the measure was never drafted. This does not mean that the CGRS does not look at asylum applicants’ social media: public profiles and data are indeed considered when assessing an application.

Data protection questions and remedies

It is a good thing that Belgium has set these plans aside, but in other countries this still happens all too often. This is very problematic. Telephone data can, after all, lead to discrimination, especially if refugees’ social media content appears unfavorable to authorities because it expresses opposing political views. Either way, such data cannot serve as a verifiable basis for rejecting an asylum seeker’s application, because it can be misinterpreted or falsified by others. Questions therefore arise about the data protection remedies refugees can invoke against the authorities.

The complexity of AI systems in migration: The limitations and biases of algorithm-based risk assessments

In addition to social media monitoring, it is also important to take a closer look at the use of AI systems in the migration context. The first thing to consider is algorithms designed to estimate the risk level of a particular person; these are meant to prevent people with certain criminal backgrounds from obtaining refugee status or being allowed to migrate into the country.

Proponents of such AI systems argue that the systems are more objective and base their risk assessments less on bias. Nothing could be further from the truth, however. The algorithms were, after all, created by people and rely on an immense amount of statistical data generated by decisions already made in the past.

If, for example, a particular group of low-skilled people from country X was denied entry to a country for generations, that history produces a large amount of statistical data to feed the algorithm. The algorithm will in turn flag people from country X as risky when they want to enter the EU.
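To make this feedback loop concrete, here is a minimal sketch in Python using entirely synthetic data and hypothetical features (country_x, prior_conviction); it does not represent any real system, only the general mechanism by which a risk model trained on historically biased decisions reproduces that bias:

```python
# Minimal sketch: a toy risk model trained on biased historical decisions.
# All data is synthetic; "country_x" and "prior_conviction" are hypothetical features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

country_x = rng.integers(0, 2, n)          # 1 = applicant comes from country X
prior_conviction = rng.integers(0, 2, n)   # the only "legitimate" risk signal here

# Historical decisions: applicants from country X were often rejected
# regardless of any actual risk indicator.
rejected = ((prior_conviction == 1) | (rng.random(n) < 0.6 * country_x)).astype(int)

X = np.column_stack([country_x, prior_conviction])
model = LogisticRegression().fit(X, rejected)

# Predicted "risk" for: [from X, no conviction], [not from X, conviction], [neither]
print(model.predict_proba([[1, 0], [0, 1], [0, 0]])[:, 1])
```

Under these assumptions, the model scores an applicant from country X with no conviction nearly as risky as an applicant with a conviction: the earlier discriminatory practice is simply reproduced, not corrected.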

A view of the reception camp on Samos, Greece

At the reception camp on the island of Samos in Greece, refugees are constantly watched by cameras, and the footage is forwarded to a control room at the Greek Ministry of Migration. An AI system is responsible for analyzing the images and warns of potential problems. The right to privacy is completely taken away from the asylum seekers in this camp.

Does the AI Act provide the solution?

To date, there is no regulation around AI. With its AI Act, the European Union would be the first global player to regulate AI. Once in force, the AI Act will act as an equivalent of the GDPR for AI. The European Commission’s initial proposal paid too little attention to migration issues: for example, no ban was initially provided for the use of emotion recognition, and the phenomenon of predictive analytics using AI was not even included. Nothing was included about the use of AI for control and surveillance either.

Prohibition of intrusive and discriminatory use of AI systems in the current draft of the Act

During the vote on the law in the European Parliament, some of these shortcomings were meanwhile eliminated. For example, the current draft of the Act prohibits intrusive and discriminatory uses of AI systems, which means that the following technologies may no longer be used once the Act enters into force:

  • Emotion recognition technologies, with the prohibition explicitly extending to European borders. Emotion recognition technologies purport to detect people’s emotions based on assumptions about how a person behaves when they feel a certain way, among other things in order to assess their credibility.

  • Biometric categorization systems, which use personal characteristics to categorize people and draw conclusions based on them. This type of software is used, for example, to determine whether a person’s dialect corresponds to the region he/she claims to come from.

  • Predictive control systems, which use biased assumptions about who might be considered a risk in order to make decisions about controlling particular groups and spaces.

GDPR issues

It has become quite clear throughout this blog that refugees and migrants have very little control over their personal data. They often have no idea which of their personal data is being processed and what actually happens to it.

The GDPR requires that personal data be collected and processed only for specific, explicit and legitimate purposes. The data must also be accurate and kept up-to-date. Asylum seekers must provide a large amount of personal data, including biometric data such as fingerprints and facial images.

This data is then stored in a central database, which various government agencies can access.

The importance of transparency and informed consent

Article 5 of the GDPR requires individuals to be informed about the processing of their personal data and about their rights under the regulation. In the European asylum system this is anything but guaranteed. Article 5 also requires that personal data be shared only for specific purposes and with specific recipients.

Still, data is all too often shared between different government agencies without proper authorization; think, for example, of the footage from the reception camp on Samos that is forwarded to the Greek Ministry of Migration.

Conclusion

We can conclude that the personal data of migrants and refugees in Europe are inadequately protected against unauthorized access, disclosure and destruction. 

We can only hope that the AI Act does go into effect and imposes restrictions on the use of AI. To this day, European agencies in charge of migration, such as Frontex, are still investing heavily in, and using, new data-driven control technologies.

If the use of such technologies is not better controlled, the future of refugees and migrants in Europe will not look bright.

For more information regarding the application of the GDPR in the context of asylum applications, contact an accredited DPO.  
