
AI in practice


Published September 18, 2025, by Canadian Health Care Network. Used with permission.

Author: Abigail Cukier

How doctors are embracing, and evaluating, artificial intelligence tools.

Artificial intelligence continues to be woven into the fabric of so many facets of our lives. This increasingly includes healthcare applications in areas such as administration, scheduling, diagnosis, research and patient-facing support. With all these possibilities, you may want to introduce AI into your own practice and be wondering how to get started.

According to the 2024 National Survey of Canadian Physicians, commissioned by Canada Health Infoway and the Canadian Medical Association, 7% of doctors surveyed said they used artificial intelligence or machine learning in their main practice to support patient care. This was a notable increase from 2% in 2021, and the number has presumably increased since, with the rapid growth of AI.

Despite the rise, factors such as privacy and security concerns and uncertainty about new technologies are keeping many physicians from taking the leap.

Dr. Puneet Seth is a family doctor in Toronto and founder of Nymble Health. Trained by clinical experts, Nymble leverages AI to deliver clinically curated information and support to help patients in their weight management journey. An early adopter of AI in his clinical practice, Dr. Seth suggests physicians start by simply learning about it.

"I think it's important to appreciate that, whether or not clinicians like it or are comfortable with the idea of AI, you can't really escape it because it is permeating society," he said. "I would argue it's a necessary responsibility for physicians to educate themselves about what's happening in that space. Patients will come and ask, 'Should I do this? Can I use this?' And if you don't have any context, you can't be helpful. "I would say to carve out a little bit of time to understand what's happening. And stay up to date because it is a rapidly evolving space." Physicians can start by researching AI terms, its learning models and its range of applications. These include patient diagnosis by analyzing medical imaging data, such as X-rays, MRIs and CT scans; transcribing medical documents; enhancing health research; appointment scheduling; and entering and extracting data. There are also many patient-facing tools.

AI scribes and clinical reference tools, such as OpenEvidence and Glass Health, are common places for physicians to start, Dr. Seth said. He began using an AI ambient scribe in 2022. "This was due to my innate curiosity and wanting to minimize the amount of administrative work. I already had an optimized practice in terms of time spent on documentation, as I was using templates and EMR features."

The larger goal was not having to think about the documentation during the patient visit.

"The biggest impact was liberating me from having to worry about documentation and the ability to be more present during the visit."

When evaluating an AI tool, Dr. Seth suggests looking at what it claims to do. "If it is claiming to do many things at once, claiming that it can address any and all medical concerns, I don't think we're there yet. The safety and standards have not come to that point."

The second thing, he says, is asking who is building it, and what validation it has. "Are they respected in the field? Is there clinical research supporting it or are there testimonials from real people? And, in addition to the people in the company, who is backing it? Is it supported by medical associations or are there others who are vetting it?"

Dr. Amjed Kadhim-Saleh is a family doctor in Toronto. As his practice grew, so did the paperwork. So, he and his wife, Mary Aglipay, who is a data scientist, created Pippen, which provides medical scribe capabilities, as well as clinical, billing and administrative support. Launched last year, it is now used by 700 doctors across Canada.

"I talk to a lot of doctors, demonstrating Pippen, and my favourite part is when they see it and see how transformative it is, their eyes light up," he said. "The hype is real. This is a transformative technology. A study from OntarioMD, Women's College Hospital and the eHealth Centre of Excellence found that 76% of doctors said using an AI scribe reduced cognitive load and 70% said it saved them up to four hours a week in administrative tasks. So I'd say, definitely try it."

Dr. Kadhim-Saleh suggests taking advantage of AI providers' free trials to test and compare them. "Also, the more you do it, the better you get. Sometimes, doctors try [an AI scribe] and it's not exactly how they would want their note to look, so they stop. But through continuous feedback, you can refine and customize it to suit your needs."

PHIPA and PIPEDA

Pippen is also compliant with regulations such as PHIPA (Ontario's Personal Health Information Protection Act) and PIPEDA (Canada's federal Personal Information Protection and Electronic Documents Act). Chantz Strong, the Canadian Medical Protective Association's executive director of technology and analytics and chief privacy officer, said physicians should consider whether AI tools have such compliance.

"Physicians should understand the risks and benefits of the specific AI tools they plan to use in their practice," Strong said. "To ensure that the tools are safe and reliable, consider whether there has been regulatory approval from Health Canada and whether a professional association or medical society has endorsed specific technologies or vendors."

Strong said a physician should also understand a tool's privacy and security parameters, consider how it collects and uses patient data, and ensure that patients have provided informed consent for the collection, storage or sharing of their health information.

"Also remember that all AI models will have some inherent bias," he said. "Bias can also be present in how an AI tool is deployed, implemented, operated and maintained. To mitigate bias, it is critical to recognize that it can and does occur, even in the absence of harmful intent."

Doctors should also consider the level of risk a certain tool carries, he said. "A low-risk example would be something that helps clinical workflow or an AI scribe, compared to something that has high patient impact, such as robot-assisted surgery. The other aspect that modifies the risk is the level of autonomy."

In other words, if the AI is just providing information or drafting an email, it is probably low risk. But if it is delivering care on its own with very little supervision, the risk is higher.

"There is little regulation in Canada so far. So right now, a high-risk tool has probably been examined for safety but those that have less patient impact and autonomy may have had less official testing and regulation. Colleges have issued guidance on AI, but in general, it's a little bit of buyer beware."

Strong said the CMPA believes AI is going to have a significant impact on healthcare and so the organization is working to keep on top of it. "We would like this to evolve in a way that doesn't push all of the liability and accountability and responsibility for understanding and using these highly technical tools solely on physicians. We hope that developers, vendors, healthcare providers and medical associations get together and figure out how to address that. And we would love it if we moved a little faster on legislative frameworks in Canada."

The CMPA has online resources for physicians regarding AI, and Strong said doctors with specific questions should call the organization directly. When evaluating an AI tool, he offers a few questions to ask:

  • What are my regulatory requirements?
  • Do colleges have specific guidance in this area?
  • Will it improve care for my patients?
  • Will it improve my practice?
  • Is there evidence that it's safe?
  • Is there evidence on its efficacy?
  • Is it practical?
  • Is there training for me and my staff?
  • How will it integrate with my software and my life as a physician?

"Physicians already have training and experience in this type of decision-making," he said. "Do what you normally would when incorporating a new tool. Your responsibility, whether it's AI or hiring a medical student to take notes, is to be accountable for everything you do when providing care. So that doesn't change."