OP-ED - My former teacher wants to put a conscience in AI. I have thoughts.
Smart legal contracts might be AI's best guardrail. But the system they're designed to protect was never neutral to begin with.
Australian emerging-technology lawyer Natasha Blycha taught me English at a boarding school outside Gweru, Zimbabwe, around the year 2000. She reignited a love of reading I'd let slip.
Blycha recently joined me on the African Tech Roundup Podcast to discuss smart legal contracts, AI sovereignty, and why the law as currently designed cannot cope with artificial intelligence.
She was brimming with insight. The conversation is worth your time.
But something she said early on has stayed with me since we hung up. Blycha cited her time in Zimbabwe — working part-time for a small law firm on constitutional questions during the land invasions — as the experience that shaped her international legal career.
"What happens to a country if there isn't foundational law working really well?" she asked. It is the question that drives her work embedding enforceable contracts into AI systems, building legal guardrails before the technology makes them impossible to retrofit.
I understand the question. I also think it requires a longer answer than the one she gave.
Whose breakdown
The land invasions Blycha witnessed were the tail end of a story that began decades earlier. The occupations peaked in 2000–2001, after a failed constitutional referendum, precisely when she was in the country. But before disgruntled war veterans backed by President Robert Mugabe occupied white-owned farms, before the currency collapsed and the fuel ran out, there was Ian Smith.
On 11 November 1965, Smith declared Rhodesia unilaterally independent from Britain. (His family home in Harare's Alexandra Park suburb sat around the corner from the company house my father's employer provided when we lived there in the mid-to-late 1980s and early 1990s, not the kind of neighbourhood many middle-class Black Zimbabweans could afford to buy or rent into on their own terms.)
The UN imposed sanctions. They made no difference. The entire Rhodesian state that followed was built on a legal architecture designed to exclude the Black majority from land ownership, political participation, and economic self-determination.
International law did not intervene to correct that, at least not meaningfully and not in time. The "rule of law" whose breakdown in Zimbabwe Blycha mourns was, for most Zimbabweans, never fully functioning on their behalf in the first place.
I raise this not to diminish her groundbreaking work, which I genuinely believe is among the most important contributions being made to responsible AI governance globally, but because it matters whose version of the breakdown we count from when we build the rules for what comes next.

The code can't answer for itself
Blycha's central assertion is that AI systems are becoming economic stakeholders the law cannot hold accountable. You cannot put code in jail.
The legal frameworks underpinning everything from property rights to financial regulation were designed for human bodies and human intentions. When those frameworks meet autonomous systems operating at scale, they fail.
She is right. And the contradiction is blistering. The international legal order she wants to reinforce for the AI age is the same order that, as Canadian Prime Minister Mark Carney laid out at Davos last month, is fracturing precisely because it was never as neutral as advertised.
Middle powers have been performing sovereignty while accepting subordination. Carney could well have been describing most of post-independence Africa.
Whose rules get embedded
So when Blycha proposes embedding smart legal contracts into AI devices — contracts that act as referees, stopping machines when they breach their own rules — the technology is compelling and the ambition admirable. But the question it raises for Africa is: whose rules get embedded?
If the legal architecture of the AI age is built primarily by Western institutions, using Western precedents, enforced through Western infrastructure, then the land grabs of the 21st century will not involve farms. They will involve data, compute, and the contractual terms baked into every device, platform, and autonomous system operating on the continent.
My paternal grandfather, Ndumo, was dispossessed by Smith's rural rezoning policies — forced into what were euphemistically called "Native Purchase Areas" so that his land could be repurposed for white-owned commercial farming. The mechanism was legal. The outcome was theft dressed in policy language.
The same dynamic is being encoded into digital infrastructure, only this time the rezoning is algorithmic and the title deeds are written in code.
Citizens of the Global South will live inside those terms the way my parents' generation lived inside legal frameworks they had no hand in drafting.
Blycha asked: what happens when the rule of law breaks down? The harder question is what happens when the rule of law was never framed with you in mind to begin with — and the people writing the next version are moving fast.
Editorial Note: A version of this opinion editorial was first published by Business Report on 10 February 2026.
