Therapy’s Privacy Crisis—How AI Apps Turn Trust into a Commodity
Discover why most AI therapy tools fail basic privacy tests—and what you can do to protect your clients from becoming the next data breach headline.


The current landscape of AI-powered therapy tools is abysmal. I recently analyzed 9 AI note-taking and session documentation products on the market, and every single one failed basic privacy and security tests. These failures aren't minor: many tools store session data indefinitely, require invasive logins, or process your most sensitive client information in the cloud, leaving it vulnerable to breaches, misuse, and even sale.
But there’s good news: Technology is evolving rapidly, enabling truly privacy-respecting AI. Lightweight, fully local processing solutions are here, ensuring that your session data never even gets created—much less stored or leaked. Scribular is the first platform to bring this breakthrough to healthcare and therapy, offering an entirely new level of trust and privacy. We genuinely hope others follow this lead.
How Today’s Therapy Tools Stack Up on Privacy
| Company Name | Privacy Red Flags 🚩 | What This Means For You ⚠️ |
| --- | --- | --- |
| Mentalyc | Stores notes and audio in the cloud; requires a login; data retained indefinitely. (Privacy Policy, Terms) | Your clients' private conversations could remain vulnerable indefinitely. |
| TheraPro AI | Uploads real-time audio to the cloud; user tracking via accounts. (Privacy Policy, Terms) | Your clients' sessions might leave your control permanently. |
| Yung Sidekick | Collects and anonymizes patient data; unclear data retention. (Privacy Policy, Terms) | "Anonymized" patient data can still be traced and used without your clear consent. |
| Twofold Notes | Stores session notes online; requires an account; long-term data retention. (Privacy Policy, Terms) | Your client notes could linger online, at risk of breaches. |
| Upheal | Integrates deeply with calendar and video tools; cloud processing and storage of session data. (Privacy Policy) | Sensitive session data and personal calendars might be vulnerable online. |
| Eleos Health | Cloud-based processing integrated with EHR; stores extensive analytics data. (Privacy Policy) | Extensive patient details stored online may be exposed or misused. |
| Lyssn | Data is used for AI training. (Privacy Policy) | Clients' personal information can become enshrined in AI models forever. |
| SOAP Note AI | Cloud-based note generation; unclear retention and data handling specifics. (Privacy Policy, Terms) | Lack of transparency about your stored session data increases vulnerability. |
| S10.AI | Recommends not including PHI; temporarily uploads data to the cloud. (Privacy Policy, Terms) | Temporary cloud uploads still pose privacy risks during transmission. |
| Scribular | No data created, no accounts, fully local. | You have nothing to lose because nothing ever leaves your device. |
How Technology Is Changing
AI doesn't have to mean "cloud-first" anymore. Previously, cloud processing was required due to resource constraints on local devices, forcing therapists into risky privacy compromises.
But now, lightweight AI models designed for local, on-device processing are available, eliminating the need to upload sensitive data. When data doesn't leave your device, you regain full control, removing enormous privacy risks.
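To make the idea concrete, here is a minimal sketch of on-device summarization in Python. It is purely illustrative, not how Scribular works under the hood; it assumes the open-source `transformers` library and a small public summarization model, which runs inference entirely on the local machine once its weights are cached.

```python
# Illustrative sketch only: on-device summarization with a small open model.
# Assumes `transformers` and `torch` are installed. The model weights are
# downloaded once and cached; after that, inference runs entirely on this
# machine, so the session text is never sent to a remote API.

from transformers import pipeline

# Small public summarization model, chosen here purely for illustration.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

session_text = (
    "Client reported improved sleep this week and practiced the grounding "
    "exercise twice. We discussed an upcoming family visit and reviewed "
    "coping strategies for managing anticipatory anxiety."
)

# Inference happens locally; nothing in session_text leaves the device.
draft_note = summarizer(session_text, max_length=60, min_length=15, do_sample=False)
print(draft_note[0]["summary_text"])
```

The same pattern applies to local speech-to-text and note drafting: when the audio and text are processed where they were recorded, there is simply no upload to secure.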
Beware the Startup Trap
A word of caution: Most tech startups, especially those funded by venture capital, operate with a single goal in mind—rapid growth followed by an exit or "sell-out." The fastest way to achieve that growth is to aggressively collect data to refine their algorithms, often at the cost of user privacy. Privacy becomes a secondary concern (at best) when your data is their biggest asset.
Startups like Mentalyc, TheraPro AI, and Eleos Health exemplify this issue—products built around data collection and storage, driven by the imperative to scale quickly, ultimately exposing users to significant privacy risk.
Who Can You Trust?
Large companies and VC-funded startups have little incentive to prioritize privacy above profits. However, you're not completely without allies. Small indie developers—people whose names you know and whose motivations are transparent—often put ethics and user safety at the forefront. Privacy-first companies whose business models are built specifically to protect users, like Signal in messaging or Proton for email, are also worth seeking out.
Scribular is proudly part of this movement. It was built by a developer for his therapist partner and is fully transparent about its intentions and operations. Scribular doesn't just protect your data—it doesn't even create any data, setting a new bar for what AI privacy can and should be.
If privacy matters to you and your clients, demand better. Ask tough questions, seek transparency, and support solutions that place privacy above profit.
Together, we can change the standard.
About the Author

Patrick
@pi0neerpat
Hi, I'm Patrick—a software developer and former FDA reviewer. I built Scribular for my therapist partner after seeing how documentation drained her energy and time. Scribular reflects my commitment to privacy, simplicity, and respect for therapeutic work: no servers, no tracking—just a thoughtful tool that helps therapists reclaim their time.
If this resonates with you—or you have thoughts, questions, or feedback—I'd love to hear from you.
Say hello 👋