Before Canada spends billions more on defence, let’s control the killer robots

As Canada moves quickly on artificial intelligence while also dramatically increasing its defence spending, there is good reason to think that these trends will converge on what many governments believe is the “killer app” of 21st-century warfare: AI-assisted weapons systems.

Earlier this year Prime Minister Mark Carney sent a clear signal that AI has become a priority file for Ottawa when he created the cabinet portfolio of Minister of Artificial Intelligence and Digital Innovation. Then this summer the government signed a memorandum of understanding with the Toronto tech company Cohere to identify where Canadian-built information systems could improve public services.

The direction here is unmistakable. The state is shifting from talk to action in exploring broader integration of AI, which will inevitably shape choices in national defence where the consequences are life and death.

Canada’s defence and AI priorities are converging

Those moves are coming at a time when Canada is making new commitments to vastly increase its military capacity. In June, NATO allies agreed to raise their defence and security spending targets to five per cent of GDP by 2035. This would be the largest sustained increase in Canada's military spending since the Cold War, more than doubling the current defence budget and potentially exceeding $100 billion annually by 2035.

These aren’t just numbers on a balance sheet; they’re choices about what kind of military Canada builds and what values are embedded in it. The spending will decide what the Canadian military buys, how it integrates software into weapons and command systems, and where human judgment fits into the chain of decision-making that can take a life. If Canada becomes complacent about human oversight, algorithms will write rules instead of people.

The potential consequences of crucial military decisions are not merely theoretical. Last month Israeli Prime Minister Benjamin Netanyahu described as a “tragic mishap” a strike on a Gaza hospital that killed 20 people, including health workers and journalists. That’s the kind of instantaneous, devastating “error” that proponents claim AI systems can prevent, but evidence from Gaza shows precisely the opposite.

Gaza and Ukraine reveal the dangers of algorithmic warfare

Multiple investigations have reported that Israel uses AI-assisted tools — bearing names like “Lavender” and “Gospel” — to generate military target lists and prioritize attacks. These systems mark large numbers of people as suspected Hamas members based on algorithmic profiling — for example, proximity to suspicious locations, contact with flagged individuals or social media interactions — and feed their names into a faster targeting cycle.

Most disturbing is how far human oversight of these deadly AI-guided exercises has been eroded. Israeli intelligence officers admit to spending just 20 seconds signing off on individual Lavender-generated strikes, despite knowing that the system misidentifies targets in approximately 10 per cent of cases. Israeli soldiers involved in the process have described their roles as “rubber stampers,” effectively delegating life-or-death judgments to algorithms incapable of nuanced judgment, moral reasoning or contextual understanding.

As this accelerated decision process bypasses critical human reflection, commanders are raising the thresholds for permissible civilian harm, authorizing attacks that accept up to 20 civilian casualties for every militant targeted. Rather than delivering the promised precision, AI-driven operations have become synonymous with humanitarian catastrophe, exposing a reality where speed and efficiency take precedence over human life and accountability.

In Ukraine, a United Nations commission concluded in May that Russian forces committed crimes against humanity by using drones to attack civilians in the Kherson region. Human Rights Watch documented quadcopter drone attacks on people riding bicycles or engaged in other everyday activities. It is the kind of close-up violence that terrorizes people going about their daily lives, and it shows what happens when technological capability, unconstrained by human restraint, reaches innocent civilians.

How Canada can lead with safeguards and human oversight

While global rules for confronting such incidents are lagging behind the pace of atrocities, there are signs of progress. For instance, last December the UN General Assembly voted 166-3 to adopt Resolution 79/62 on lethal autonomous weapons. This year diplomats have been exploring ways to use the UN’s Convention on Conventional Weapons as a vehicle for setting new limits on AI-assisted military technology. While these initiatives are important, there are still no binding international laws.

Canada does not need a global treaty in order to act responsibly. Our military already uses Geneva Convention guidelines to conduct legal reviews of new weapons, a process that also checks compliance with the laws of war. But we must ensure that any such process clearly defines and quantifies required levels of human control before our military procures weapons systems that move faster than our values.

For instance, the government can require that all potentially lethal actions remain subject to human decision-making, allowing sufficient time and information to halt the operation. Ottawa can also require that weapon vendors submit event logs, model versioning, and thorough explanations to independent reviews whenever things go wrong. If a supplier cannot meet that standard, the system should not be deployed.

The NATO pledge to more than double defence spending only makes these criteria more urgent. If Canada is preparing to invest at levels not seen since the 1950s, the military must keep humans firmly and accountably in charge. The clearest line Carney can draw is also the simplest: Canada will not use AI-assisted weapons that can select and attack human targets without a human decision. The government can adopt that policy at home, and advocate for it in NATO and at the UN.

Domestic policy points in the same direction. The government’s deal with Cohere shows that Ottawa intends to use AI, so it should weave that momentum into a transparent, public directive on military AI that sets testing requirements, engagement controls, and accountability mechanisms. And we should publish those standards in order to help inform and maintain public consent as defence spending rises, and send Canadian firms and allies a clear signal about the safeguards expected.

Protecting the ‘right to hesitation’ in warfare

There is another crucial principle at stake that deserves plain language. Democracies depend on the ability to pause, to weigh risks and to accept responsibility for using force. Scholars call this “the right to hesitation”: giving human beings the time and space needed to properly deliberate before making decisions that contribute to violence.

Designing deliberation space into systems is not weakness. It is discipline, and it is how we draw the line between restraint and catastrophe. Proponents argue AI can reduce human error, but the evidence from Gaza shows how algorithmic bias and speed can amplify mistakes rather than prevent them, creating new forms of deadly error that occur at machine speed yet leave human-scale devastation.

Canada is right to modernize, and to cultivate domestic AI capacity. It is also right to insist that humans remain in command when lives are at stake. Gaza and Kherson are warnings, not templates.

Canada is well positioned to lead by example, insisting on clear red lines and practical controls that keep human judgment at the centre of any use of force. If we do, we will be better allies and a stronger democracy. If we do not, we risk waking up in a world where the space for ethics has been engineered out.

This article first appeared on Policy Options and is republished here under a Creative Commons license.
