Eticas

Chinook: an algorithm to help migration officers in Canada decide who to let in

The Russian invasion of Ukraine, which began on 24 February 2022, has provoked one of the greatest refugee crises of recent times. By 24 April 2022, more than 5.2 million refugees had fled the war in Ukraine, according to data compiled by the UN High Commissioner for Refugees (UNHCR). Do you know how much influence a migration algorithm can have in situations like this?

And even before the invasion of Ukraine, the UNHCR estimated by mid-2021 that the total number of refugees in the world was more than 84 million. That figure started to increase dramatically after 2011, and in Europe the relatively large number of migrants arriving on European soil in 2015 was labelled a “refugee crisis”.

Governments are looking at algorithms as one way to help them manage the growing numbers of people seeking asylum or otherwise migrating into their countries (it is when someone is granted asylum that they are legally recognised as a refugee).

An interesting case is Canada, which since at least 2014 has been experimenting with automated algorithmic systems to try to increase the efficiency of its immigration system. The general idea, as is often the case when public administrations introduce algorithms, was to automate some of the tasks carried out by immigration officials. But this becomes problematic when the tasks handled by the algorithm may include deciding “whether an application is complete, whether a marriage is ‘genuine’, or whether someone should be designated as a ‘risk’”.

Reportedly, in the case of “straightforward applications, the system approves eligibility solely based on the model’s determination, while eligibility for more complex applications is decided upon by an immigration officer”. In theory, every application still has to be reviewed by an immigration officer before the decision is final.

It should be noted here that the people whose applications may be turned down partly because of such algorithmic decisions include some of the most vulnerable people on Earth: those fleeing war and other violent dangers and seeking refuge elsewhere.

There is a general lack of transparency about which algorithms Canada is using and for exactly what purposes, and detailed data about the impact of these algorithms is also lacking. However, a judicial case in 2021 forced Immigration, Refugees and Citizenship Canada (IRCC) to disclose information about Chinook, one of the algorithmic tools it has been using since 2018, in this case to bulk-process not asylum claims but immigration and visitor applications.

Since the introduction of Chinook in March 2018, student visa refusals have reportedly increased significantly, particularly for some countries: the student permit refusal rate for India, for instance, went from 34% in 2018 to 57% in 2020.

To find out whether there are good and legitimate reasons for such an increase in the number of student visa applications denied by the Canadian government, we need to know how Chinook works: the algorithm should be transparent and explainable. The same goes for any other automated algorithmic system used to decide whether a person fleeing violence is granted asylum, or whether someone migrating in search of better life prospects can stay in their country of destination.

Generally speaking, algorithms used in decision-making processes that impact people’s lives should be made auditable by external and independent parties.

You can find more information about Chinook and many other algorithmic systems in the OASI Register, and you can read more about algorithmic systems and their potential social impacts on the OASI pages. And if you know of an algorithm with a potential social impact that we haven’t yet included, you can let us know.