
How to Use AI in Therapy Documentation
There's a version of this conversation that goes like this: "AI is going to replace therapists." That's not what this article is about. AI is not going to replace therapists. But it's quietly reshaping how therapy documentation gets done — and the design choices vendors make about where AI fits matter more than whether it's used.
The documentation burden is real. Studies estimate therapists spend 25-50% of their working hours on administrative tasks, with clinical documentation as the single largest time sink. That's time not spent with clients, not spent on professional development, not spent on the rest of your life. The market has noticed: AI-assisted documentation is now standard in most new clinical platforms, and existing EHRs are racing to add it.
This piece is an honest map of the territory: what AI is genuinely good at in therapy documentation, what it's not, where the design choices split, and how Theracharts deliberately sits in a narrower lane than most "AI documentation" vendors.
The two AI documentation patterns therapists encounter
Two distinct patterns dominate the AI-in-therapy-documentation market right now, and they make very different trade-offs.
Pattern A: AI session-note drafting. The vendor generates the entire session note (SOAP, DAP, BIRP) from your input — bullet points, dictation, or in some cases live session audio. The AI's draft becomes your note. You review and approve. The note lives in the vendor's system. Examples: Blueprint, Eleos, Upheal, Mentalyc, and most "AI scribe for therapists" tools.
Trade-off: real time savings, but your legal medical record now lives across two systems (your EHR for billing/scheduling, the AI vendor for notes). Audit paths get complicated. Migration risk is real. And you're trusting the AI vendor's clinical voice to approximate yours under time pressure.
Pattern B: AI clinical updates that paste into your EHR. The vendor generates a data-grounded summary — assessment trends, score deltas, alerts triggered, completion patterns — that you copy and paste into your EHR's existing note as the measured-data section. Your EHR keeps the note. The clinical narrative (subjective, observed, what you did and why) stays in your hands. Theracharts ships this pattern.
Trade-off: smaller time savings on note-writing itself, but your legal medical record stays in one place. The AI is doing one narrow job well — synthesizing measured data — instead of trying to write your clinical voice. Less ambitious; less risk.
Both patterns are legitimate. Which one fits depends on how much you trust your audit chain, how much you value the EHR as single source of truth, and how much of your note-writing time goes to data summary versus clinical narrative in your practice.
What AI is good at (and what it's not)
AI is excellent at structure. Given a set of session details, it can organize information into the correct sections of a SOAP note — subjective observations into S, objective data into O, clinical reasoning into A, next steps into P. It doesn't get the format wrong. It doesn't forget a section. It doesn't leave the plan blank because you ran out of time.
AI is also good at pulling in context. If your documentation tool is connected to your client's assessment data, the AI draft can automatically include recent PHQ-9 scores, trend directions, active treatment goals, and clinical alerts. Information you'd normally have to look up and manually transcribe is already there.
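To make "pulling in context" concrete, here is a minimal sketch of the kind of data-grounded summarization this involves: turning stored assessment scores into a plain-text trend line a note can include. All names here are illustrative, not any vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class AssessmentScore:
    instrument: str   # e.g. "PHQ-9"
    score: int
    date: str         # ISO date the assessment was completed

def summarize_trend(scores: list[AssessmentScore]) -> str:
    """Build a plain-text measured-data line from chronologically ordered scores."""
    if len(scores) < 2:
        return "Insufficient data for a trend."
    prev, last = scores[-2], scores[-1]
    delta = last.score - prev.score
    direction = ("decreased" if delta < 0
                 else "increased" if delta > 0
                 else "held steady")
    return (f"{last.instrument} {direction} from {prev.score} ({prev.date}) "
            f"to {last.score} ({last.date}).")
```

Note what this sketch does and doesn't do: it reports the measured change, but it makes no claim about whether the change is clinically meaningful. That judgment stays with you, which is the subject of the next point.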
What AI is not good at is clinical judgment. It can draft a note that says "Client's PHQ-9 decreased from 14 to 10, suggesting improvement in depressive symptoms." It cannot determine whether that improvement is meaningful for this particular client in this particular context. That's your job.
AI also can't capture the therapeutic nuance that matters. The shift in a client's tone when they mentioned their mother. The moment of insight that happened in the last five minutes. The nonverbal cue that contradicted what the client was saying. These details require a therapist's eye and ear — and they require your editing hand in the final note.
The human-in-the-loop principle
The most important design principle in AI-assisted clinical documentation is this: the therapist reviews everything before it saves.
This isn't a philosophical nicety. It's a clinical and ethical requirement. AI-generated content can be wrong. It can hallucinate details. It can misinterpret your input. It can use language that doesn't match your clinical voice. It can miss something critical.
The review step isn't optional overhead — it's the whole point. The AI handles the structural work (formatting, organizing, pulling in data). You handle the clinical work (accuracy, nuance, judgment, voice). Together, the note is written faster and more completely than either could produce alone.
Any tool that saves AI-generated content directly to a clinical record without therapist review is a tool you should not use. The standard is simple: nothing touches the record without your approval.
Voice dictation: where it lives now
If you want voice dictation for session notes, the right place to look is your EHR's mobile app. Most major EHRs (SimplePractice, TherapyNotes, Jane) ship native voice-dictation in their phone apps — speech-to-text directly into the note field where the legal record lives. The quality has improved dramatically; modern speech-to-text handles clinical terminology, proper nouns, and assessment abbreviations (PHQ-9, GAD-7, BIRP) with high accuracy.
This is the right architectural place for it: the note lives in the EHR, the audio-to-text happens in the EHR's app, no separate vendor handling speech audio of clinical content. Theracharts intentionally does not handle session audio or session-note dictation — that's not the lane we're in. Our AI work is bounded to data summarization (assessment trends, score changes, alerts) which doesn't require speech input.
Addressing the skepticism
Therapists have legitimate concerns about AI in documentation. Here are the most common ones and how to think about them.
"Can AI really capture what happened in my session?"
It doesn't need to capture everything. It needs to produce a structured first draft that's close enough to save you time. You handle the nuance in the edit. Think of it as a competent administrative assistant who writes the first draft of your note — not a replacement for your clinical voice.
"What about confidentiality?"
This is a real concern and the answer depends entirely on the tool. Any AI documentation tool handling therapy notes must be HIPAA compliant, with appropriate BAAs, encryption, and access controls. Data should be encrypted at rest and in transit. The AI processing should happen in a secure, compliant environment. Ask your vendor about their infrastructure before using any AI documentation tool.
"Will insurance companies or boards have issues with AI-written notes?"
AI-assisted notes are still your notes. You reviewed them, edited them, and approved them. The final document is your clinical work product, regardless of how the first draft was generated. That said, regulations in this space are evolving — stay informed about your state licensing board's guidance on AI in clinical practice.
"What if the AI gets something wrong and I miss it?"
This is the strongest argument for taking the review step seriously. Read the draft. Every time. Don't skim. The time savings from AI are real, but they only work if you're actually reviewing what it produces. A wrong detail in a clinical note is worse than a slow note.
What to look for if you want AI session notes (Pattern A)
If you've decided on the AI session-note vendor pattern, evaluate carefully:
Where does your legal medical record live? If it's in the AI vendor's system, you're trusting them with audit defensibility. If it's in your EHR with a copy from the AI vendor, you have two systems to keep in sync. Either is workable; neither is automatic.
How does it handle session audio? Some vendors record live sessions; others only take written/dictated input post-session. Live recording is a higher trust ask — clarify your jurisdiction's wiretap and recording-consent laws and your clients' explicit consent path.
BAA, encryption, audit logs. Non-negotiable. Verify before signing.
Multiple note formats. SOAP / DAP / BIRP / GIRP at minimum. Locked-in formats become a liability when your supervisor or licensing board changes preferences.
Mandatory therapist review before save. No auto-save, no background writes to the clinical record. If a vendor offers "fully automated notes," that's a red flag — clinical judgment doesn't get to be background.
What to look for if you want AI clinical updates (Pattern B)
If your model is EHR-as-source-of-truth + AI for the data-summary layer, the bar is different:
Does the AI summarize measured data only? The clearest version of Pattern B doesn't try to write your clinical narrative. Instead, it produces a structured measured-data summary (scores, deltas, alerts, completion stats, goal progress) that you paste into the note you write in your EHR. Theracharts ships this exact pattern.
How is "since last update" tracked? A good clinical-update generator picks up automatically from your last export, so the next update naturally covers the inter-session window without date pickers.
Plain text or markdown. Pattern B output should paste cleanly into any EHR's note field — no rich text formatting that breaks SimplePractice's progress note, no JSON-laden output you have to clean up.
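The "since last update" behavior described above can be sketched in a few lines. This is an illustrative toy, assuming a simple model where each export advances a timestamp marker; it is not how any particular vendor implements it.

```python
from datetime import datetime, timezone

class UpdateTracker:
    """Hypothetical tracker: each export covers everything since the previous one."""

    def __init__(self) -> None:
        self.last_export: datetime | None = None

    def window(self, events: list[tuple[datetime, str]]) -> list[str]:
        """Return event descriptions recorded since the previous export,
        then advance the marker so the next update picks up from here."""
        cutoff = self.last_export or datetime.min.replace(tzinfo=timezone.utc)
        recent = [desc for ts, desc in events if ts > cutoff]
        self.last_export = datetime.now(timezone.utc)
        return recent
```

The design point: because the marker advances on every export, the therapist never selects a date range; each update is automatically scoped to the inter-session window.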
The bottom line
AI in therapy documentation isn't going away — it's becoming foundational. The choice that matters isn't whether to use it; it's which architectural pattern you're comfortable with. AI-as-note-writer (Pattern A) saves more time but moves part of your legal record outside your EHR. AI-as-data-summarizer (Pattern B) preserves the EHR-as-single-source model but offers narrower time savings. Neither is wrong; they fit different practices.
What's not legitimate, regardless of pattern, is AI generating clinical content without therapist review before save. The clinical content is yours. The judgment is yours. The voice is yours. The AI is doing structural work; you're doing the work that requires being in the room.
Theracharts is in Pattern B — we generate AI clinical updates that paste into your EHR's note fields, and we deliberately stay out of session-note territory. Your EHR keeps the note; we add the data layer. Get started free.