EP&T | Christopher Reynolds | Nov 4, 2020
Last week, privacy watchdogs revealed that five million images of shoppers’ faces were collected without their consent at a dozen of Canada’s most popular malls.
Real estate company Cadillac Fairview embedded cameras equipped with facial-recognition technology, which draws on machine-learning algorithms, in digital information kiosks to discern customers’ ages and genders, according to an investigation by the federal, Alberta and B.C. privacy commissioners.
But the commissioners had no authority to levy fines against the firm, or against other companies that misuse Canadians’ personal information, an “incredible shortcoming of Canadian law that should really change,” B.C. information and privacy commissioner Michael McEvoy said in an email.
Despite its status as an artificial-intelligence hub, Canada has yet to develop a regulatory regime to deal with problems of privacy, discrimination and accountability to which AI systems are prone, prompting renewed calls for regulation from experts and businesses.
“We are now being required to expect systematic monitoring and surveillance in the way that we walk down the road, drive in our cars, chat with our friends online in small social-media bubbles. And it changes the way that public life occurs, to subject that free activity to systematic monitoring,” said Kate Robertson, a Toronto-based criminal and constitutional lawyer.
At least 10 Canadian police agencies, including the RCMP and Calgary and Toronto police services, have used Clearview AI, a facial-recognition company that has scraped more than three billion images from the Internet for use in law enforcement investigations, according to a report co-written by Robertson.
Other Ontario police forces may also be “unlawfully intercepting” private conversations in online chat rooms via “algorithmic social-media surveillance technology,” according to the September report from the University of Toronto’s Citizen Lab and International Human Rights Program.
“We have seen the lack of clear limits and focused regulation leaving an overly broad level of discretion in both the public and police sectors that is a call to action for governments across the country,” Robertson said in a phone interview.
Canada needs to roll out concrete rules that balance privacy and innovation, said Carolina Bessega, co-founder and chief scientific officer of Montreal startup Stradigi AI. Public trust in artificial intelligence becomes increasingly crucial as machine-learning companies move from the conceptual to the commercial stage, she said.
The regulatory vacuum also discourages businesses from deploying AI, holding back innovation and efficiency, particularly in hospitals and clinics, where the implications can be life or death.