Mental Health Is Private.
Our Technology Reflects That.

We built Anchor because we believe everyone deserves a safe space to process their thoughts without worrying about data brokers, cloud breaches, or corporate surveillance.

Our Story

In 2024, the mental health app industry generated billions in revenue. It also generated unprecedented data privacy scandals, with major platforms caught selling sensitive user data to advertisers and data brokers.

When I looked for a tool to supplement my own therapy between sessions, my background at the intersection of medicine and technology helped me see the problem clearly: existing apps forced a tradeoff between utility and privacy. To get AI-driven insights, you had to upload your most vulnerable moments to a corporate cloud.

I decided to reject that tradeoff.

Anchor was born from a simple thesis: what if we combined modern Large Language Models with on-device processing to build a companion that lives entirely on your phone? What if, architecturally, we made it impossible for us to ever see your data?

Architecture of Trust

How we guarantee your privacy at the engineering level.

Zero-Cloud Sync

We disabled CloudKit and removed all remote sync capabilities from the app architecture. Your session data cannot leave your device without an explicit export command from you.
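In SwiftData terms, this looks roughly like the sketch below (the `Session` model and its properties are illustrative, not Anchor's actual schema): passing `cloudKitDatabase: .none` to `ModelConfiguration` tells SwiftData never to attach the store to a CloudKit container.

```swift
import Foundation
import SwiftData

// Illustrative model — stands in for the app's real session record.
@Model
final class Session {
    var transcript: String
    var createdAt: Date

    init(transcript: String, createdAt: Date = .now) {
        self.transcript = transcript
        self.createdAt = createdAt
    }
}

// `cloudKitDatabase: .none` guarantees SwiftData never provisions or syncs
// a CloudKit container for this store — records exist only on this device.
let config = ModelConfiguration(cloudKitDatabase: .none)
let container = try ModelContainer(for: Session.self, configurations: config)
```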

Local Processing

We use Apple's native Speech framework for transcription, ensuring speech-to-text happens entirely on-device, leveraging the Neural Engine in your iPhone.
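A minimal sketch of what on-device transcription looks like with the Speech framework (the helper function and locale are illustrative): setting `requiresOnDeviceRecognition = true` on the request means the task fails outright rather than silently falling back to Apple's servers.

```swift
import Foundation
import Speech

// Hypothetical helper: transcribes a recorded session file strictly on-device.
func transcribeLocally(_ audioURL: URL, completion: @escaping (String?) -> Void) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        completion(nil)  // this device/locale has no on-device model
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    // Hard requirement: if on-device recognition is unavailable, the task
    // errors out instead of routing audio to a remote server.
    request.requiresOnDeviceRecognition = true

    recognizer.recognitionTask(with: request) { result, error in
        if let result, result.isFinal {
            completion(result.bestTranscription.formattedString)
        } else if error != nil {
            completion(nil)
        }
    }
}
```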

AES-256 SwiftData

All session transcripts, mood tracking, and clinical insights are stored via SwiftData and encrypted at rest using iOS's native Data Protection API.
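At the file level, this amounts to marking the store with complete protection (a sketch, assuming `storeURL` points at the SwiftData store file): under `NSFileProtectionComplete`, iOS keeps the file encrypted with AES-256 keys whenever the device is locked.

```swift
import Foundation

// Applies NSFileProtectionComplete to the on-disk store, so iOS Data
// Protection keeps it encrypted (AES-256) whenever the device is locked.
// `storeURL` is an assumed path to the app's SwiftData store.
func applyCompleteProtection(to storeURL: URL) throws {
    try FileManager.default.setAttributes(
        [.protectionKey: FileProtectionType.complete],
        ofItemAtPath: storeURL.path
    )
}
```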