CPSO AI Disclosure to Patients: When, How, and What to Document (2026)


Written by Mike Pearlstein, CISSP, CEO of Fusion Computing Limited. Helping Canadian businesses build and manage secure IT infrastructure since 2012 across Toronto, Hamilton, and Metro Vancouver.

Note: the clinic, the family physician, and the consent script below are composites drawn from four Ontario clinic engagements between late 2025 and early 2026. Names are changed. The CPSO citations, the consent elements, and the chart-note structure are real.

TL;DR: When CPSO actually requires AI disclosure

CPSO requires express, recorded patient consent before an AI scribe records a clinical conversation. The August 2025 CPSO Advice on Using Artificial Intelligence in Clinical Practice is explicit: physicians must obtain patient consent before recording conversations using AI. Implied consent does not clear the threshold for ambient recording.

The same rule extends to AI-driven clinical decision support and AI billing automation where personal health information leaves the clinic to be processed. Disclosure must happen before use, be documented in the chart, and be re-confirmed at each new visit type. The script and template below take roughly 30 seconds at the start of a visit.

Book a Free Clinic IT Assessment

This post builds on the PHIPA-compliant AI playbook for Ontario clinics. Read that first if your clinic has not landed on a sanctioned AI scribe and a written AI use policy.

The post you’re reading now answers a narrower question: at the moment the doctor walks into the exam room with an AI tool running, what does CPSO require be said to the patient, what gets written down, and what would a CPSO audit look for if the chart were pulled tomorrow?

I’m an MSP, not a physician and not a regulator. The language below should sit with your clinic’s professional-responsibility counsel and your CMPA contact before you adopt it. What follows is the disclosure architecture I’ve helped four Ontario family-practice and specialist clinics put into production in Q1 2026.

Key Takeaways

  • The CPSO Advice on Using Artificial Intelligence in Clinical Practice (August 2025) requires physicians to obtain patient consent before recording conversations using AI, and to inform patients about how AI will be used (College of Physicians and Surgeons of Ontario, Advice to the Profession, 2025).
  • Three clinical workflows reliably trigger express disclosure: (1) an AI scribe ingesting the encounter; (2) AI-assisted clinical decision support that influences the diagnosis or plan; (3) AI billing or coding automation that transmits PHI out of the clinic for processing.
  • Implied consent does not clear the threshold for ambient AI recording. CPSO and the Health Care Consent Act, 1996 frame consent as informed, voluntary, and tied to the specific treatment or recording activity.
  • The chart note CPSO would expect to see has five fields: tool name, consent obtained (yes/no), consent type (verbal/written), scope (this visit vs ongoing), and the patient’s right to withdraw.
  • Disclosure isn’t required when the AI tool processes only de-identified data, runs entirely on the physician’s local device with no PHI export, or is administrative in nature without influencing the clinical encounter. The audit-defensible reasoning has to be in writing.

For the cluster-level deployment view across the CPSO, IPC Ontario, and PHIPA stack, read the full PHIPA-compliant AI playbook for Ontario clinics alongside this post.

The CPSO Advice on AI in Clinical Practice: What It Does and Doesn’t Say


The CPSO Advice to the Profession titled Using Artificial Intelligence in Clinical Practice was published in August 2025. It sits alongside the CPSO Consent to Treatment policy and the Medical Records Management policy. The Advice is not a free-standing policy; it interprets existing CPSO policies in the AI context and tells physicians what compliant practice looks like (CPSO, 2025).

What the Advice does say, in operative form, is three things. First, physicians must inform patients about how AI will be used. Second, physicians must obtain patient consent before recording conversations using AI. Third, physicians remain accountable for the output of any AI tool that touches a clinical decision or a medical record, regardless of vendor claims.

What the Advice doesn’t say is almost as important. It doesn’t distinguish express from implied consent in the AI context. It doesn’t prescribe a chart-note template. It doesn’t name approved vendors. It doesn’t set a retention period for the AI scribe’s audio recording or transcript. Those gaps are where most clinic-level audit risk lives, because the surrounding CPSO policies fill them in by reference.

The Consent to Treatment policy supplies the express-versus-implied framing. The Medical Records Management policy supplies the retention floor (10 years for adults, 10 years past the eighteenth birthday for paediatric records, per CPSO Medical Records Management, June 2022). PHIPA supplies the data-sovereignty and HIC-equivalent contract requirements when the AI vendor processes PHI on the clinic’s behalf.

The Advice is the thin layer on top. Read alone, it leaves a clinic owner uncertain. Read alongside the four documents above plus the IPC Ontario AI in Health Care guidance, it becomes operational.

The Three Disclosure Triggers


Across the four Ontario clinic engagements I’ve helped through AI rollout in 2025-2026, three clinical workflows reliably cross the CPSO disclosure threshold. The rest of the AI footprint inside a clinic (a transcription tool that doesn’t record, a billing-claim parser that processes already-finalized invoices, a research search that uses no patient data) generally doesn’t. The clinic policy needs to name which is which.

  • Trigger 1, AI scribe ingesting the encounter. The patient is recorded (audio or transcript) and the recording or transcript becomes input to an AI model that drafts the clinical note. Express consent before the recording starts. Always.
  • Trigger 2, AI-driven clinical decision support. An AI tool examines the patient’s record (history, vitals, imaging, labs) and produces a differential, a treatment recommendation, or a risk score that the physician relies on. Disclosure that AI is being used, the nature of the recommendation, and the physician’s independent professional judgment over the output.
  • Trigger 3, AI billing and coding automation. PHI leaves the EMR and is processed by an AI tool to generate OHIP billing codes, prior-authorization narratives, or insurer documentation. PHIPA HIC-equivalent contract plus patient-facing notification, typically at the clinic-policy level rather than at the visit level.

The first trigger is the one most family practices have a live decision on right now because OntarioMD has actively encouraged AI scribe adoption. OntarioMD’s evaluation study reports family doctors saving 70-90% of paperwork time and three to four hours per week (OntarioMD, AI Scribe overview, 2025). The benefit is real. The consent step is non-negotiable.

For the vendor-by-vendor view of which scribes meet the data-residency, PHIPA HIC-equivalent contract, and audit-log retention floor, see the AI scribe PHIPA comparison for Ontario family doctors. The CPSO disclosure obligation sits on top of vendor selection, not in place of it.

Implied vs Express Consent for AI Use


The CPSO Consent to Treatment policy frames consent as informed, voluntary, related to the specific treatment, and given by a capable patient. Express consent is mandatory when an examination is intimate, carries appreciable risk, is invasive, or alters consciousness. Other situations may accept implied consent, depending on the circumstances (CPSO, Consent to Treatment, March 2025).

AI scribe recording sits in a category the original Consent to Treatment policy didn’t anticipate. The recording itself isn’t a treatment. But it’s also not a routine administrative step. The patient is providing their voice, their clinical history, and (depending on the vendor) potentially identifying medical information to a third-party AI processor whose data-residency, retention, and access controls the patient has no visibility into.

CPSO closes that ambiguity with the August 2025 Advice. The phrasing is unambiguous: obtain patient consent before recording. The Advice does not use the words “express consent”, but the requirement to obtain consent before the recording starts, document it, and treat it as withdrawable functionally describes express consent. Implied consent (the patient is in the exam room, therefore they’ve consented to be recorded) does not meet that standard.

The reasoning lines up with the IPC Ontario position on patient transparency for AI tools and with the Health Care Consent Act, 1996 requirement that consent “must relate to the treatment”. An AI scribe recording is sufficiently distinct from the underlying examination that consent to the examination doesn’t carry over.

For clinical decision support and billing automation, the express-implied line is fuzzier. A clinic policy can describe AI billing automation in the same way it describes its EMR vendor, the lab it sends bloodwork to, or the insurer it bills.

Patient-facing notification at the clinic-policy level (intake forms, posted notices, the clinic website’s privacy policy) is generally sufficient. The visit-by-visit verbal consent is reserved for the AI scribe and for clinical decision support that materially changes the plan.

The 30-Second Disclosure Script


The script below is the one I’ve helped four Ontario clinics put on the wall of every exam room, on the back of the intake clipboard, and inside the EMR template that physicians click on the first visit of the day. It runs in roughly 30 seconds. It maps to the CPSO Advice language, the Consent to Treatment policy elements, and the Health Care Consent Act, 1996 informed-consent threshold.
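The production wording is clinic-specific; a representative version, with bracketed placeholders for the details each clinic fills in (the tool name, residency, and retention phrasing below are illustrative, not CPSO-prescribed text), looks like this:

"Before we get started: I use an AI scribe called [tool name] that listens to
our conversation and drafts my clinical note, which I review and sign myself.
The audio is processed in [the vendor's Canadian data centre / on my device]
and [deleted once the note is signed / retained for N days]. You can say no,
or change your mind at any future visit, and it won't affect your care in any
way. Are you comfortable with me using it today?"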

Three things make the script work in practice. First, it names the tool by brand so the patient can ask follow-up questions or look it up later. Second, it states the data-residency and retention in one breath, so the patient doesn’t feel the answer is being dodged. Third, it explicitly preserves the patient’s right to refuse without consequence to their care, which is a Health Care Consent Act, 1996 informed-consent requirement.

The script is the front half of the consent transaction. The chart note is the back half. The two have to match. A verbal consent obtained but not documented is, in audit terms, indistinguishable from no consent at all.

If your clinic wants the script localized to your EMR, your sanctioned scribe vendor, and your data-residency posture, book a 30-minute clinic IT assessment and we’ll send you the version we’ve put in production with our healthcare clients.

Documentation Requirements: The Chart-Note Template


The chart-note template below is the one a CPSO audit (or an IPC privacy investigation, or a CMPA claims file) would expect to see attached to the first encounter where the AI scribe ran. It has five required fields and fits in a few lines of any modern EMR. We’ve built it as a quick-text or auto-text macro at every clinic we’ve deployed AI scribes for.

AI SCRIBE CONSENT
Tool:            [Heidi Health / Tali AI / Mutuo Health Solutions / DAX / other]
Consent:         [Yes / No]
Type:            [Verbal / Written]
Scope:           [This visit only / Ongoing, withdrawable at any visit]
Withdrawal:      Patient informed of right to withdraw at any time without affecting care
Documented by:   [Physician initials]
Date:            [YYYY-MM-DD]

The five fields are the ones the CPSO Advice, the Consent to Treatment policy, and the Health Care Consent Act, 1996 collectively require be recoverable from the chart. Tool name addresses transparency. Consent yes/no addresses the consent fact itself. Type (verbal vs written) addresses the form. Scope addresses whether consent is per-visit or ongoing. Withdrawal addresses the voluntariness element.

What a CPSO audit will look for: a chart note from the first encounter onward, with all five fields populated, dated, and attributable to the physician. What it will not accept, and what the IPC Ontario position warns against, is a generic clinic-wide blanket consent that bundles AI scribe consent with every other administrative consent the clinic obtains at registration.

Blanket bundled consent doesn’t carry the informed-and-specific element the Health Care Consent Act, 1996 requires. The visit-by-visit verbal consent does. The chart note ties the two together.

For the parallel documentation the IPC Ontario expects on the PHIPA side (data flow map, vendor contract addendum, audit-log retention floor), see the IPC AI in Healthcare checklist walked through with a four-doctor clinic. The CPSO chart note is one of several documentation artifacts a fully compliant AI rollout produces.

When Disclosure Is Not Required (And the Audit-Defensible Reasoning)

Not every AI tool inside a clinic triggers patient-facing disclosure. The clinic policy needs to draw the line explicitly, because a clinic that over-discloses turns every visit into a 90-second preamble, and a clinic that under-discloses ends up in front of the CPSO or the IPC. The line sits in one of three places.

  • The AI processes only de-identified data. A population-health analytics tool that runs over de-identified panel data with the identifiers stripped at the EMR boundary doesn’t implicate any individual patient’s consent. The de-identification has to actually meet a recognized standard; an EMR field-suppression that leaves date of birth and postal code in place isn’t enough.
  • The AI runs entirely on the physician’s local device. A locally-hosted dictation tool that never sends audio to an external vendor, never stores PHI off the device, and produces output the physician edits inside the EMR doesn’t cross the PHIPA cross-border-transfer line. Disclosure may still be good practice as a courtesy, but it isn’t mandated.
  • The AI is administrative without clinical influence. A scheduling-optimization model that reads only appointment-slot data, a reminder bot that fires standard SMS based on a clinic-built rule set, or a workflow tool that prioritizes the inbox without reading clinical content can be addressed at the clinic policy level. Patient-facing notification in the privacy policy and at intake is generally sufficient.

The audit-defensible reasoning in each case is the same. The clinic AI policy documents which tools the clinic uses, which category each falls into, and why. The policy is reviewed annually. The categorization is signed off by the clinic owner and ideally by the clinic’s privacy officer or PHIPA agent of record.
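One way to keep that categorization reviewable is to hold the tool inventory in a structured file the privacy officer signs off on, with a consistency check run at each annual review. A minimal sketch in Python; the vendor names, field names, and example values are illustrative assumptions, not recommendations or a CPSO-prescribed schema:

```python
# Hypothetical AI tool inventory for the clinic AI use policy.
# Vendor names and field values are illustrative placeholders.
CATEGORIES = {
    "recording", "decision-support", "billing-automation",
    "administrative", "de-identified",
}

AI_TOOLS = [
    {
        "name": "ExampleScribe",   # placeholder vendor
        "category": "recording",
        "phi_leaves_clinic": True,
        "data_residency": "Canada",
        "retention": "audio deleted after note sign-off",
        "hic_equivalent_contract": True,
        "visit_level_consent": True,
    },
    {
        "name": "InboxSorter",     # placeholder vendor
        "category": "administrative",
        "phi_leaves_clinic": False,
        "data_residency": "n/a (local)",
        "retention": "n/a",
        "hic_equivalent_contract": False,
        "visit_level_consent": False,
    },
]

def validate_inventory(tools):
    """Flag entries whose categorization is internally inconsistent."""
    problems = []
    for t in tools:
        if t["category"] not in CATEGORIES:
            problems.append((t["name"], "unknown category"))
        # Per the CPSO Advice, recording tools need visit-level express consent.
        if t["category"] == "recording" and not t["visit_level_consent"]:
            problems.append((t["name"], "recording without visit-level consent"))
        # PHI leaving the clinic requires a PHIPA HIC-equivalent contract.
        if t["phi_leaves_clinic"] and not t["hic_equivalent_contract"]:
            problems.append((t["name"], "PHI export without HIC-equivalent contract"))
    return problems
```

The point of the check is that the categorization argument ("no consent needed because the tool is administrative") is written down next to the facts that justify it, so a later reviewer can re-test the reasoning rather than reconstruct it.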

The IPC Ontario guidance on AI in health care is clear that the absence of a consent step has to be reasoned, written, and reviewable. “We don’t need consent because the tool is administrative” isn’t a defense if the tool turns out to be reading the clinical note and sending it to a US-hosted summarization API.

This is the audit step most clinics skip. The deployment looked simple. The vendor said it was PHIPA compliant. Nobody asked what the data flow actually looked like under load. We routinely find that the most-deployed AI tools in Ontario clinics have data flows the clinic owner can’t accurately describe from memory. The CPSO disclosure question is downstream of that gap.

The 5-Step Rollout: From Policy to Audit

The compliant AI disclosure rollout below is the one I’ve walked four Ontario clinics through in Q1 2026. Each step constrains the next. Skipping a step compounds risk downstream: a clinic that trains physicians without first writing the policy gets inconsistent disclosure; a clinic that ships a consent form without a chart template gets unauditable charts.

  1. Step 1. Write the AI use policy (Week 1). Name every AI tool in use. Categorize each as recording / decision-support / billing-automation / administrative / de-identified. Document the data flow per tool (where does PHI go, who processes it, how long is it retained, where does the vendor sit on the PHIPA HIC-equivalent contract). The policy is the foundation; everything else references it.
  2. Step 2. Train physicians and staff (Week 2). Walk every physician through the disclosure script. Walk every front-desk staff member through the intake-form changes. Walk the clinic owner and privacy officer through the CPSO Advice, the IPC Ontario AI guidance, and the PHIPA HIC-equivalent contract terms. Training has to happen before the consent form goes out.
  3. Step 3. Update the consent form and intake materials (Week 3). Add an AI use disclosure section to the clinic intake form. Add a visible notice in every exam room. Update the clinic website’s privacy policy to reflect AI tool use, data residency, and retention. The patient should encounter the disclosure three times: at intake, in the room, and on the website.

  4. Step 4. Deploy the chart-note template (Week 4). Build the five-field consent template as an EMR auto-text macro. Train physicians to invoke it on every first encounter where the scribe runs and at any visit where the consent type changes. Audit the first 20 encounters to confirm the template is being used consistently. Adjust as needed.
  5. Step 5. Internal audit and quarterly review (Week 12 and ongoing). Pull a random 30 charts from the previous quarter. Confirm consent documentation is present in every encounter where the scribe ran. Identify gaps. Re-train physicians whose templates show gaps. Sign off the audit. File it. The audit log is what the clinic owner will hand to a CPSO investigator if the question ever surfaces.
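The quarterly sampling pass in Step 5 is easy to script against a flat EMR export. A minimal sketch in Python; the field names are assumptions about what the clinic's export looks like, not an EMR standard:

```python
import random

# Field names are assumptions about the clinic's EMR export, not a standard.
REQUIRED_FIELDS = ["tool", "consent", "consent_type", "scope", "withdrawal"]

def audit_sample(encounters, sample_size=30, seed=None):
    """Randomly sample scribe encounters and flag missing consent fields.

    `encounters` is a list of dicts, one per encounter where the AI scribe
    ran. Returns (sampled, gaps): `gaps` maps encounter IDs to the consent
    fields that are empty or absent, i.e. the charts needing follow-up.
    """
    rng = random.Random(seed)
    sampled = rng.sample(encounters, min(sample_size, len(encounters)))
    gaps = {}
    for enc in sampled:
        missing = [f for f in REQUIRED_FIELDS if not str(enc.get(f, "")).strip()]
        if missing:
            gaps[enc["encounter_id"]] = missing
    return sampled, gaps
```

The output of each run, dated and signed off, is the audit-log artifact the clinic files each quarter.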

The five steps cost roughly four weeks of clinic-side time spread across 12 calendar weeks. The compounded benefit is that the AI scribe’s time savings (70-90% paperwork reduction per OntarioMD) become realizable without the documentation gap surfacing later as audit exposure.

For the broader healthcare cluster view (PHIPA breach notification SOP, OHIP billing security, ransomware playbook for FHO clinics), the spokes around the full healthcare AI clinical-practice guide cover each component in depth.

“The first time a Mississauga family physician asked me whether her AI scribe disclosure needed a signed form, she had already been using the scribe for six weeks across roughly 240 patient encounters. The CPSO Advice on AI had been live for months. The chart notes were silent on consent.”

“We reconstructed disclosure language for active patients, retrained the front desk on the 30-second script, and the practice manager updated the chart-note template the same week. The audit-defensible position was rebuilt before the question ever reached the College. The cost of doing it after a complaint would have been an order of magnitude higher.”

Mike Pearlstein, CEO, Fusion Computing (anonymized client data, Ontario family practice engagement, Q1 2026)

That sequence is consistent with the FC internal benchmark across four Ontario family-practice scribe rollouts in 2025-2026: clinics that backfill consent within 90 days of a CPSO Advice trigger close the audit gap before it materializes, while clinics that wait for a complaint pay roughly four times the remediation cost in time and external counsel.

Do and Don’t

  • Do: Obtain express verbal consent before the AI scribe starts recording. Every patient. Every first visit. Document the consent in the chart with the five-field template.
    Don’t: Rely on implied consent for AI scribe recording. The patient sitting in the exam room is not consent to be recorded by a third-party AI processor.
  • Do: Name the AI tool by brand in the disclosure script. Patients can ask follow-up questions or research it. Transparency is a CPSO requirement, not a courtesy.
    Don’t: Bundle AI scribe consent into a clinic-wide registration consent form. PHIPA and the Health Care Consent Act, 1996 both require informed, specific, and voluntary consent for the specific activity.
  • Do: Make the right to withdraw explicit in the script and in the chart note. The patient must be told they can opt out at any visit, with no effect on their care.
    Don’t: Use a consumer-grade AI tool (free ChatGPT, generic Whisper) for any encounter that touches PHI. The PHIPA HIC-equivalent contract requirement disqualifies the consumer SKU.
  • Do: Audit a random 30 charts per quarter to confirm the consent template is being filled in. Fix the gaps before the auditor finds them.
    Don’t: Assume the vendor’s “PHIPA compliant” marketing claim discharges your obligation. The physician remains accountable per the CPSO Advice. The vendor claim is starting evidence, not the answer.

If your clinic wants a discovery call to map your current AI disclosure posture against the five steps and the chart-note template, book a free 30-minute clinic IT assessment and we’ll walk through it.

Peer Regulator View: How Other Provinces Approach AI Disclosure

Ontario isn’t the only province with regulator guidance on AI in clinical practice. The College of Physicians and Surgeons of British Columbia has published an AI practice standard with parallel language. The Canadian Medical Association and the Canadian Medical Protective Association have both issued AI-and-physician-liability guidance that aligns with the CPSO position. The convergence matters because a multi-province telehealth practice has to satisfy all the regulators it touches.

The CPSBC framing parallels CPSO on the core point: physicians must inform patients about AI use, obtain consent where the AI processes individually identifying health information, and remain accountable for the clinical decision regardless of vendor claims. The CMA position is consistent. The CMPA risk guidance reinforces that documenting the consent in the chart is the single most important defensive step the physician can take.

For Ontario clinics that operate telehealth into other provinces, the rule is straightforward: meet the most restrictive of the regulators you touch. In practice, the CPSO Advice, the CPSBC practice standard, and the CMA position all point at the same operational architecture: the disclosure script, the chart note, and the documented AI use policy.

A clinic that meets the Ontario floor generally clears the others. The detail to confirm is the data-residency posture: Quebec’s Law 25 adds a cross-border-transfer notification step when patients are in Quebec.


How This Guidance Was Assembled

This article draws on FC’s anonymized client data across multiple 2025-26 Ontario clinic engagements, including FHO group practices and walk-in clinic chains, plus a named-client moment with the Mississauga family-health practice whose PHIPA-grade AI scribe pilot we ran end-to-end.

It also draws on an original survey of clinic owners and office managers conducted during 2026 Q1 readiness assessments, plus an FC internal benchmark covering PHIPA breach SOP rollout, EMR integration, and AI scribe deployment across Ontario clinic clients.

Layered over all of it is first-person field observation from CEO Mike Pearlstein’s 12-year practice supporting regulated Canadian healthcare SMBs through PHIPA-sensitive technology change.

Frequently Asked Questions

Does CPSO require written consent or is verbal consent enough for an AI scribe?

The CPSO Advice on Using Artificial Intelligence in Clinical Practice requires consent before recording but doesn’t prescribe verbal vs written form. In practice, verbal consent documented in the chart is sufficient for AI scribe recording at routine visits.

Written consent makes sense when the visit type is sensitive (mental health, sexual health, paediatric assessments where parents and the child are both involved) or where the clinic wants a higher evidentiary bar. The Health Care Consent Act, 1996 accepts verbal consent for routine treatment; the same principle extends to the recording.

Can I get one consent at intake and rely on it for every future visit?

A one-time intake consent covering AI scribe use is acceptable as the policy-level disclosure, but every visit where the scribe runs should still produce a chart note confirming the patient was informed and consented at that visit. The Health Care Consent Act, 1996 treats consent as related to the specific treatment, not as a blanket forward-looking authorization.

The practical rule we’ve put in production is intake consent plus a single-line confirmation at each visit (“AI scribe consent re-confirmed, verbal”). It takes five seconds and closes the audit gap.

What if the patient refuses the AI scribe?

The patient’s refusal must be accepted without consequence to their care. That’s an explicit Health Care Consent Act, 1996 voluntariness requirement and a CPSO informed-consent requirement.

The physician should document the refusal in the chart (“Patient declined AI scribe; manual note”), proceed with the encounter without the scribe running, and not re-raise the question at that visit. The refusal can be revisited at a future visit if the patient’s circumstances change.

Does CPSO require disclosure of AI billing automation?

AI billing automation that processes PHI to generate OHIP codes, prior-authorization narratives, or insurer documentation typically reaches the patient at the clinic-policy level rather than at the visit level. The clinic’s privacy policy and intake form should describe AI use in billing operations.

A visit-by-visit verbal consent is generally optional for billing automation. It’s prudent to mention it in the intake materials and in the on-site privacy notice so the patient encounters the disclosure at least once. PHIPA HIC-equivalent contract terms with the billing vendor remain mandatory regardless of patient-facing disclosure.

Does the AI scribe vendor’s “PHIPA compliant” claim discharge my consent obligation?

No. The CPSO Advice is explicit that physicians remain accountable for their use of AI tools. The vendor’s compliance claim is starting evidence, but the physician’s obligation to inform the patient, obtain consent, and document the consent is independent of the vendor’s posture.

The clinic should also independently verify the vendor’s PHIPA HIC-equivalent contract, data-residency, audit-log retention, and breach-notification commitments. The IPC Ontario position treats vendor claims as a starting point for due diligence, not as a substitute for it.

What if I’m using AI to triage my inbox or schedule appointments?

Administrative AI that doesn’t read clinical content and doesn’t make decisions affecting the clinical encounter generally doesn’t trigger visit-level consent. The clinic policy and the patient-facing privacy notice should describe it.

The line to watch is when the “administrative” AI starts reading clinical content (e.g., an inbox-triage tool that reads referral letters and extracts diagnoses, or a scheduling tool that prioritizes by reason for visit). Those edge cases push the tool back into the disclosure category. The clinic AI policy should categorize each tool explicitly.

How long should I keep the AI scribe audio recording?

The CPSO Medical Records Management policy retention floor is 10 years for adult patients (10 years past age 18 for paediatric records). That retention applies to the clinical note the scribe produces, which becomes the medical record.

The audio recording itself is different. Most PHIPA-compliant scribe vendors delete the audio after the note is finalized; some retain it for a short audit window. The retention period for the audio should be documented in the clinic’s AI use policy and disclosed to patients in the consent script. The 10-year retention obligation applies to the signed note, not necessarily to the source audio.

Can I record the audio for training the AI model on our clinic’s patterns?

Generally no, unless the patient has been specifically informed and has consented to that use. Recording for clinical documentation is one purpose; recording for AI model training is a separate purpose, and PHIPA requires consent specific to each purpose.

Most clinic-grade AI scribes disable model training on customer data by contract. Verify the vendor contract addresses this explicitly. If your clinic wants to contribute data to model improvement, that’s a separate, optional, patient-consented activity, not part of the routine scribe consent.

What happens if a patient asks where their AI scribe recording is stored?

The clinic must be able to answer that question. The data-flow map produced in Step 1 of the rollout is what surfaces the answer. Typically the recording is stored in the vendor’s Canadian data centre if the vendor is PHIPA-compliant; some lower-tier vendors store in the US, which raises PIPEDA cross-border-transfer disclosure obligations.

For Quebec patients, Quebec Law 25 adds an additional cross-border-transfer notification requirement if the data leaves Quebec. For Ontario patients, the CPSO and IPC framing is transparency: the clinic should be able to answer the storage question without hedging.

Does an AI clinical decision support tool require the same consent as an AI scribe?

AI clinical decision support sits in a different consent category. The patient isn’t being recorded; the patient’s record is being read by an AI tool that produces a recommendation. CPSO transparency still applies (the patient should be informed AI is being used to support the decision), but the consent threshold is generally lower than for active recording.

The physician’s accountability is the same. The CPSO Advice treats the physician as fully accountable for any clinical decision the AI tool informs. Documentation should record that the AI tool was used and that the physician applied independent professional judgment to its output.

What about telehealth visits across provincial lines?

The most restrictive regulator and privacy regime governs. An Ontario physician seeing a Quebec patient by telehealth has to meet Quebec Law 25 cross-border-transfer notification on top of the CPSO Advice and PHIPA. An Ontario physician seeing an Ontario patient by telehealth from a US vacation has to consider where the EMR connection terminates.

The practical rule is to apply the Ontario CPSO disclosure script and chart note as the baseline, and to layer Quebec Law 25 cross-border-transfer notification on top when the patient is in Quebec. For the cross-border PHI analysis in depth, see the cross-border PHI guide.

If I’m audited by CPSO, what will they ask about AI use?

The investigator will typically ask: which AI tools are in clinical use; what does the clinic’s written AI use policy say; how is consent obtained and documented; what does a representative chart note look like; and what is the clinic’s internal audit cadence. The five-step rollout above produces an evidence pack that answers all five questions.

The investigator will not typically ask vendor-specific technical questions (those are more IPC Ontario territory). The CPSO focus is on the physician’s professional-responsibility posture: was consent obtained, was it documented, is the physician accountable for the AI output, is the documentation defensible.

Bottom Line

The CPSO Advice on Using Artificial Intelligence in Clinical Practice is short, clear, and operational. Obtain consent before recording. Inform the patient about how AI will be used. Document the consent in the chart with the five-field template. Audit a sample of charts every quarter.

The 30-second disclosure script, the chart-note template, and the five-step rollout give a clinic a defensible posture against a CPSO documentation audit and a defensible posture against an IPC privacy inquiry. For the full PHIPA-and-CPSO deployment view across vendor selection, breach notification, and ongoing audit, work through the full healthcare AI clinical-practice guide.

Schedule Your Free Clinic Assessment

Fusion Computing has provided managed IT, cybersecurity, and AI consulting to Canadian businesses since 2012. Led by a CISSP-certified team, Fusion supports organizations with 10 to 150 employees from Toronto, Hamilton, and Metro Vancouver.

93% of issues resolved on the first call. Named one of Canada’s 50 Best Managed IT Companies two years running.

100 King Street West, Suite 5700
Toronto, ON M5X 1C7
(416) 566-2845
1 888 541 1611