The excitement around generative artificial intelligence (AI), fueled by the ubiquitous ChatGPT, has taken the tech world by storm, as much for its sensational capabilities as for its limitations.
However, Canadian companies have been slow to catch up with those in the United States, a new KPMG survey shows.
For this study, KPMG surveyed 250 companies, 90 in Canada and 160 in the United States, each reporting annual gross revenue of between $500 million and $1 billion.
The survey found that only 35% of Canadian companies are currently using AI in their operations, compared to 72% in the United States.
“Canadian organizations are lagging behind their American counterparts in adopting AI, and this comes at a time when developments in the field are evolving rapidly, particularly in the area of generative AI,” said Benjie Thomas, Canadian Associate Director, Consulting Services, KPMG in Canada.
More than 50% of Canadian respondents admit they could use AI more effectively. Currently, only about 30% are experimenting with ChatGPT or using AI in their call centers; in the United States, those figures are roughly double.
One of the biggest obstacles facing Canadian organizations is the lack of talent. Almost half say they don’t have the in-house expertise to verify the accuracy of their AI algorithms.
Another KPMG survey released in February shows that more than 50% of employees don’t trust AI at work, which could also explain its slow adoption.
Additionally, many argue that the datasets used to train AI algorithms are too small, too large, uninformative, inaccurate, or improperly formatted.
“The first step for any company considering AI adoption is to ‘prep your data,’” said Zoe Willis, partner and national head of data and digital at KPMG in Canada.
According to Willis, this involves compiling a comprehensive data inventory, mapping where the data resides, assessing its accuracy, timeliness, and relevance, identifying current and potential data gaps, and determining who has access to the data and whether they are well equipped to analyze it.
“Without quality data, AI algorithms are likely to produce biased, incorrect, misleading and unreliable results, and business consequences include errors that lead to poor business decisions,” she added.
But while more than 50% of Canadian respondents recognize the risks of making decisions based on poor-quality data, only 44% regularly hire independent external experts to check their AI algorithms for errors and biases, compared to 75% in the U.S.
A robust AI governance framework that ensures data integrity, privacy, accountability, security, and other considerations is also critical to AI adoption, but only 43% of Canadian organizations have one.
The need is even more critical now that regulators around the world are beginning to crack down on tools like ChatGPT over privacy and national security concerns.
The Canadian government, for example, introduced Bill C-27 last June in a first-ever comprehensive attempt to regulate AI. The bill has yet to be debated in Parliament.
Additionally, earlier this month, the Privacy Commissioner of Canada announced an investigation into ChatGPT over the possible use of personal information without consent. Last week, the administration of US President Biden also called for public comment on potential accountability measures for AI systems.
KPMG’s survey found that 72% of US companies have a responsible AI framework in place, but as technology changes rapidly, the need to adapt and take more protective measures is critical.
“Organizations need AI models that are efficient and sustainable, but also agile enough to adapt to the world around them,” said Kareem Sadek, Partner, Advisory, and Head of IT Risk and Emerging Technologies, KPMG. “Organizations that don’t do this will be less competitive and less trustworthy, and will end up falling behind.”
For the original article, see IT World Canada, a sister publication of Direction informatique.
French adaptation and translation by Renaud Larue-Langlois.