Canada is using AI to speed up immigration applications — but it comes at a cost, experts warn

Oct 2 2025, 5:53 pm

Immigration, Refugees and Citizenship Canada (IRCC) has adopted technology to speed up application processing, but experts warn it may be creating more problems than it solves.

Mario Bellissimo, an immigration lawyer and founder of Bellissimo Law Group PC, described the IRCC’s use of AI as an “inevitable” and “commendable development.”

“The principal advantage lies in efficiency. Processing times have been reduced, routine cases are triaged more effectively, and officers are able to concentrate their efforts on more complex matters,” he said. “For example, automation in spousal sponsorships and temporary resident visa applications has significantly decreased assessment times.”

But although automation has helped reduce processing times, these gains raise issues about fairness and potential bias in decision-making.

Efficiency vs. fairness


Mario Bellissimo (bellissimolawgroup.com) | Petra Molnar (Submitted)

Bellissimo points to some serious issues, including an over-reliance on technology.

“One notable example involved refusal reasons being generated at the exact same timestamp as the application was processed, raising legitimate concerns about whether any substantive human review had occurred,” he said.

In an email to Daily Hive, an IRCC spokesperson stated that “no final decisions are made by artificial intelligence” and that tools don’t refuse or recommend refusing applications.

“Immigration, Refugees and Citizenship Canada (IRCC) uses advanced analytics tools to help sort applications based on set rules,” they said. “These tools rely on machine learning to spot patterns in past decisions.”

But it’s precisely these patterns that Petra Molnar questions.

The author of The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence and an associate director at the Refugee Law Lab at York University, Molnar warns that relying on past decisions can perpetuate bias and lead to unfair outcomes.

“Technology is not neutral,” she said.

She added that the use of technology without proper oversight creates a risk of oversimplifying complex cases, such as those involving refugees and asylum seekers.

“Automation also replicates structural discrimination and systemic racism, and applicants are rarely told how decisions about their lives are being made,” she said.

IRCC maintains that advanced analytics tools are continuously reviewed by employees who ensure that results are consistent with applications that receive a full human review.

“Identifying and mitigating potential bias or discriminatory impacts is always a critical focus as we design, develop, and implement technologies,” states the IRCC website.

Calls for transparency


Evgenia Parajanian/Shutterstock

Experts also highlight the lack of transparency in how IRCC uses technology to process applications.

Bellissimo recalled that the public first learned IRCC had introduced tools like Chinook without disclosure or consultation — the information only came to light through Access to Information requests. In a 2022 statement, IRCC said that Chinook only extracts information from applications to present it in a clear, user-friendly format for officers.

Bellissimo has long advocated for greater transparency.

In 2022, he submitted a brief to the House of Commons Standing Committee on Citizenship, proposing measures such as passing laws governing AI use, establishing specialized training for staff, and implementing independent external audits with enforcement powers.

“The essence of the brief was that without legislated guardrails, AI risks replicating historical patterns of discrimination within Canadian immigration,” he explained.

Bellissimo said it’s about ensuring that technology is used responsibly, and that immigration advocates should be treated as partners rather than watchdogs.

Molnar also notes that Canada is increasingly relying on private companies for automated immigration and border management tools. She pointed out that these partnerships are opaque, with undisclosed contracts that make it almost impossible for lawyers and applicants to find out how decisions are made.

“Canada could be a leader in rights-based innovation, but right now, the use of automated decision-making systems in immigration looks more like an experiment on vulnerable populations than a fair or just reform to a system that desperately needs it,” she said.
