By Claudiu Popa, University of Toronto, The Conversation
News of the Canadian government’s sudden decision to block TikTok from running a business in Canada landed with a thud last week.
Since then, navigating media coverage in search of concrete information has felt like a wild goose chase. There is a definite lack of clarity around the claim of mysterious “national security risks.”
Industry Minister François-Philippe Champagne is well known for his efforts to loosen the North American economy’s ties to China. When pressed by the media for details on how Canadians should interpret the decision, he simply said that Canadians will have to “draw their own conclusions.”
That answer is opaque. Back in 2023, TikTok pre-emptively created a Transparency and Accountability Center to offer authorities a behind-the-scenes view of its algorithms and content moderation practices, even as American lawmakers pressed the company to disclose its information access and processing practices.
TikTok transparency efforts
Last year, when Canada announced a ban of TikTok on government devices, I supported the move as an expert in fintech cybersecurity, asking why any work devices had access to distracting social media applications in the first place.
TikTok has further offered transparency through Project Texas, a program to relocate data to American servers and undergo third-party audits. Canada, however, has not engaged in or acknowledged such transparency efforts, possibly bypassing a co-operative solution in favour of more drastic restrictions.
I’ve never been a TikTok user and have no more interest in the platform than I have in the outfit formerly known as Twitter, with its well-documented content moderation challenges. But from where I sit, the Canadian government’s handling of TikTok raises critical concerns beyond content moderation, from reliance on secrecy to potential human rights implications.
Claiming — without offering any discernible evidence — that national security risks are so severe that they can’t even be shared with the public means citizens are essentially being told they can continue using the app but at their own risk.
Such an obvious appeal to fear, uncertainty and doubt seems to be intentionally crafted to create cognitive dissonance. It not only reinforces an authoritarian stance but, more importantly, erodes everyone’s understanding of security, risk and privacy.
Secrecy: Security by obscurity
By opting for a secretive national security review, Canada has avoided releasing specifics about the alleged risks. Such actions set a dangerous precedent, promoting a “guilty until proven innocent” mindset. This opaque approach could also foster a chilling effect, dissuading foreign investment in Canada, especially in digital sectors.
The secrecy surrounding this decision raises questions about its underlying motivations. It suggests a potential inclination toward controlling information of public interest rather than sharing it with stakeholders.
Whether this was intended to send a message to other Chinese companies in Canada remains to be seen, but such firms currently operate in the retail, e-commerce, banking, energy and resources sectors and are no doubt closely watching the proceedings. That’s particularly true given that five other China-linked companies in Canada have been unceremoniously shut down in the past two years.
As a result, it seems more likely than not that Canadian companies operating in China — like Magna, Bombardier, Saputo and the Bank of Montreal, among others — may soon face some retaliatory headwinds when it comes to doing business in the Asian country.
Setting a risky precedent
If it’s censoring a platform primarily due to foreign ownership, Canada could be setting a precedent that threatens global standards for internet freedom. Such actions risk empowering governments worldwide to impose restrictions on platforms and services in the name of security, potentially stifling freedom of expression and access to information.
When I previously wrote about Zoom and how its obscure development and IP-access practices posed a particular risk to the privacy and confidentiality of children and students during the COVID-19 pandemic, I argued:
“China’s understanding of privacy is vastly different: the data belongs to the organizations that collect it and any such organizations must grant unfettered access for government inspection, in the name of safety and security. Article 77 of its Cybersecurity Law ensures that data is collected and stored in China where full transparency and access must be provided to the Ministry of Public Security. Period.”
Once TikTok’s offices are shut down and hundreds of employees are laid off, it will likely be difficult for Canadians to get access to information about the company’s safety procedures, ask about online moderation and initiate Privacy Commissioner investigations, simply because TikTok will no longer exist in our country.
Impact on government credibility
I certainly don’t expect to have access to privileged information. But the secretive nature of Canada’s expulsion of TikTok (or is it truly aimed at its parent company, ByteDance?) risks undermining public trust in government decisions at a moment that could instead serve as an ideal opportunity to raise awareness among Canadians about genuine security concerns.
If the public perceives this move as an excessive, disrespectful overreach under the guise of security, it may bring into question foreign policy decisions and corporate law enforcement practices.
Ultimately, the manufactured dichotomy between a heavy-handed approach to urgent corporate expulsion and the resulting inability of government agencies to conduct future privacy investigations on behalf of Canadians appears both intentional and calculated.
While it is objectively true that all social media companies collect and process user information, it is also factually true that TikTok has, at least by all publicly available measures, demonstrated a degree of transparency on par with its industry peers.
Claudiu Popa, Author and Lecturer in Fintech Cybersecurity, Information Risk and Enterprise Privacy Management, University of Toronto
This article is republished from The Conversation under a Creative Commons license. Read the original article.