The System in Place

The federal government’s lofty immigration goals, the backlog of applications created by the COVID-19 pandemic, and the large number of applications filed daily have pushed IRCC to modernize a previously paper-based system. Yet despite the recent increase in decision-making technology, automation is not new to the department.

Application Automation

Since 2015, most visa-exempt travellers have been issued an Electronic Travel Authorization (eTA) through an automated system. Piggybacking off that success, in 2017 the IRCC conducted a pilot project that used AI to triage and approve low-risk temporary resident visa applications from China. These applications were sorted into two tiers: low-risk, which were auto-approved, and medium-to-high risk, which were reviewed by an officer.
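In rough terms, the pilot amounted to a routing rule: score each application’s risk, auto-approve the low-risk tier, and send everything else to an officer. The sketch below is purely illustrative; IRCC has not published its model, features, or thresholds, so the risk score, cutoff, and field names here are all assumptions.

```python
# Illustrative sketch of a two-tier triage rule, assuming a risk score already
# exists. The threshold and fields are invented; IRCC's actual criteria are not public.
from dataclasses import dataclass


@dataclass
class Application:
    applicant_id: str
    risk_score: float  # hypothetical output of an upstream risk model


LOW_RISK_CUTOFF = 0.2  # assumed value, not an IRCC figure


def triage(app: Application) -> str:
    """Route an application into one of the two tiers described in the pilot."""
    if app.risk_score < LOW_RISK_CUTOFF:
        return "auto-approve"    # Tier 1: low-risk, approved automatically
    return "officer-review"      # Tier 2: medium-to-high risk, sent to an officer


if __name__ == "__main__":
    for app in (Application("A-001", 0.05), Application("A-002", 0.61)):
        print(app.applicant_id, triage(app))
```

The routing itself is trivial; everything contentious sits upstream, in how a risk score like this would be produced.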

Chinook

IRCC created an Excel-based tool named Chinook to help sort and bulk-process applications. When using it, officers are assigned a set number of applications, whose data has been pre-extracted and organized into a spreadsheet. Officers can then review multiple applications on a single screen and flag anything concerning.
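As described, Chinook’s core function is to flatten each application into a row so an officer can scan a batch at once and flag files of concern. A hypothetical sketch of that workflow follows; the column names, sample data, and flagging rule are invented for illustration and are not drawn from the actual software.

```python
# Hypothetical Chinook-style bulk review: application data is pre-extracted into
# spreadsheet-like rows, reviewed as a batch, and individual rows can be flagged.
# All field names, sample values, and the flagging rule are assumptions.
import csv
from io import StringIO

RAW_ROWS = """applicant_id,country,purpose,previous_refusals
A-101,India,study,0
A-102,Brazil,work,2
A-103,Nigeria,study,1
"""


def load_batch(raw: str) -> list[dict]:
    """Pre-extract application data into one row per applicant."""
    return list(csv.DictReader(StringIO(raw)))


def flag_for_review(row: dict) -> bool:
    """Invented rule of thumb: flag applicants with any prior refusals."""
    return int(row["previous_refusals"]) > 0


for row in load_batch(RAW_ROWS):
    status = "FLAGGED" if flag_for_review(row) else "ok"
    print(f'{row["applicant_id"]:<6} {row["country"]:<8} {status}')
```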

Since Chinook was introduced in 2018, refusals of applications from certain countries have increased. For example, refusals of study permits from India rose by 23% between 2018 and 2020.

Beyond the increase in what Chinook would consider ‘undesirable applications’, critics have raised concerns about how the tool is used. Many lawyers and applicants cannot understand, let alone challenge, the reasons for some auto-rejections. This can be seen in the landmark Ocran case. Not only did Ms. Ocran’s case provide the first insight into the inner workings of the program, but her lawyer also argues that the rejection would not have occurred if an officer had handled her file.

What is the New System and How Will it Work?

With the IRCC searching for a way to streamline its system, in 2021 the Ministry of Immigration pledged $428.9 million over five years to replace its case-management system. Although the new system has yet to be named and its details have not been made public, the department’s digital policy team has created a 56-page automation playbook outlining its development objectives.

The document suggests that the IRCC should allow the use of deep-learning algorithms, which can be unpredictable and work “outside the scope of meaningful scrutiny and accountability”. Although this technology is not meant to determine on its own whose application will be accepted, facial recognition may also be implemented.

Although the playbook states that the IRCC should be transparent about its use of AI, it also recommends sharing only limited information, to prevent applicants from taking advantage of the system. Nor would applicants be given the option to opt out of automated processing. To offset any discriminatory practices, however, the algorithm is expected to be tested against existing human rights laws.

While it may be comforting that the playbook calls for humans to remain in the decision-making process and to be accountable for manually reviewing cases, that principle does not appear to be followed under the current system.

What Critics are Saying

Arghavan Gerami, founder of Gerami Law PC, says she has a problem with introducing AI into the immigration system. “It’s not fair,” she says. “An independent officer is not exercising discretion.” Petra Molnar, an expert on Canada’s approach to automation, shares Gerami’s concerns.

Even though automation could make the system more efficient, Molnar says we “missed the step where we’ve had a conversation as a society about whether or not we’re okay with automating certain aspects of a high-risk decision-making system like immigration”. Finding a balance between discretion and automation will be difficult.

University of Calgary professor Gideon Christian also debated this topic in the House of Commons last month. Alongside the drop in approval rates for applicants from India, there has also been a drop in student visa approvals from African countries. He fears that “racism and discrimination is evident from a human review of this application. If we train artificial intelligence technology using this data, we’re going to have a regurgitation of that same problem”.