Privacy Protocol
Effective Date: May 2026
Architecture Objective: Absolute Data Sovereignty
1. The Principle of Sovereign Intelligence
Oura-Casa is architected from the ground up as a sovereign intelligence environment. We operate on the absolute principle that your cognitive processing, emotional signals, and personal integrations remain entirely your own. We do not monetize your data, we do not read your journals, and we do not use your private processing to train foundational AI models.
2. Information We Collect
To facilitate your access to the vault, we collect only the minimum operational data required:
- Identity Infrastructure: When you authenticate via Google or standard email, we receive your email address, profile name, and a unique identifier (UID). This UID is used exclusively to mathematically bind your data to you.
- Encrypted Vault Data: If you utilize our Cloud Synchronization, your chat history, emotional signals, and extraction chains are stored in a secure Firestore database. This data is strictly sandboxed.
- API Credentials: If you provide a personal API key (e.g., Google Gemini, OpenRouter) to bypass standard limits, this key is heavily encrypted before being stored in your cloud profile and is only decrypted at runtime to execute your specific requests.
- Telegram Telemetry (Opt-in): If you choose to configure Telemetry Sync to your Telegram account, we securely bind your Telegram `chat_id` inside your local browser storage. We use this exclusively to securely dispatch performance logs directly to you.
- Anonymous Analytics: We may collect non-identifiable error logs (e.g., crash reports) purely to maintain the architectural stability of the ecosystem.
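The UID binding described above can be sketched as follows. This is an illustrative stand-in, not Oura-Casa's actual schema: the `vaults` collection name and `userId` field are assumptions, and an in-memory `Map` stands in for the encrypted Firestore write.

```typescript
// Hypothetical sketch: every vault document is keyed by the authenticated UID.
// "vaults" is an assumed collection name; the real schema is not public.

type VaultRecord = { userId: string; payload: string };

// Build the document path that scopes a record to exactly one user.
function vaultDocPath(uid: string): string {
  return `vaults/${uid}`;
}

// In-memory stand-in for an encrypted Firestore write.
const store = new Map<string, VaultRecord>();

function writeVault(uid: string, payload: string): void {
  // The userId field mirrors the UID so database security rules can compare
  // the requester's identity (request.auth.uid) against resource.data.userId.
  store.set(vaultDocPath(uid), { userId: uid, payload });
}

writeVault("uid-123", "encrypted-blob");
```

Because the path and the `userId` field both derive from the same UID, a record can never be addressed or validated under any other identity.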
3. How Your Data is Processed
Oura-Casa functions as a hybrid ecosystem, allowing you to choose how your data is handled:
- Absolute Offline Mode (WebLLM): If you toggle "Offline Engine", Oura-Casa will download a quantized 8B language model directly to your device's cache. In this mode, zero bytes of conversational data leave your local GPU memory. The engine runs entirely air-gapped from the internet.
- Local-First Mode: When operating locally without cloud sync, your data never leaves the physical memory of your local device. The application reads and writes solely to your browser's IndexedDB.
- Cloud Processing: When interacting with the AI engine via Cloud Mode, your raw input is temporarily transmitted to third-party language models (like Google Gemini) purely for real-time inference. Google does not use API inputs to train their models. Once the synthesis is returned, the transaction is closed.
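The hybrid routing described above can be sketched as follows. The engine interfaces and names here are illustrative stand-ins, not Oura-Casa's actual API; the point is that only an explicit Cloud Mode selection sends a prompt off-device.

```typescript
// Hypothetical sketch of hybrid mode routing (assumed names, not the real API).

type Mode = "offline" | "local-first" | "cloud";

interface Engine {
  infer(prompt: string): string;
}

// Stand-in for the on-device WebLLM engine: nothing leaves the machine.
const offlineEngine: Engine = { infer: (p) => `[local] ${p}` };

// Stand-in for a cloud inference call: the prompt is transmitted
// transiently, only for real-time inference.
const cloudEngine: Engine = { infer: (p) => `[cloud] ${p}` };

function route(mode: Mode, prompt: string): string {
  // Only explicit Cloud Mode sends the prompt off-device;
  // both offline and local-first modes stay on the local engine.
  return mode === "cloud" ? cloudEngine.infer(prompt) : offlineEngine.infer(prompt);
}
```

A design note: keeping the routing decision in a single function makes the privacy boundary auditable — there is exactly one code path through which data can leave the device.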
4. Third-Party Subprocessors
We rely on enterprise-grade infrastructure to secure your vault. We share data only to the extent necessary to keep the system online:
- Google Firebase: Utilized for secure user authentication and encrypted database hosting. Protected by strict Firebase Security Rules ensuring `request.auth.uid == resource.data.userId` at the database level.
- AI Inference Providers: Your prompts are processed via APIs like Google Vertex AI or OpenRouter. These providers are legally bound to zero-retention policies for API queries.
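For context, a rule of that shape typically sits inside a Firestore rules file like the sketch below. The `vaults` collection name is an assumption, and production rules would also check `request.resource.data` on document creation (when `resource.data` does not yet exist):

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Assumed collection name; only the owner identified by
    // request.auth.uid may touch a document whose userId matches.
    match /vaults/{docId} {
      allow read, write: if request.auth != null
        && request.auth.uid == resource.data.userId;
    }
  }
}
```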
5. Security & Encryption
All data transmitted between your device and the cloud vault is encrypted in transit using industry-standard TLS. Data at rest is encrypted by our infrastructure providers (Google Cloud). We enforce strict access control logic at the database layer, ensuring that no request lacking your authentication token can read or write to your vault.
6. Absolute Ownership and Erasure
You hold absolute ownership of your data. At any point, you can navigate to your settings and initiate a Total Vault Wipe. This function instantly and permanently deletes your chat logs, signals, custom protocols, and API keys from our databases. There are no soft-deletes, no 30-day retention periods, and no hidden backups. Your exit from the archive is as clean as your entry.
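The hard-delete semantics of a Total Vault Wipe can be sketched as follows. This is a minimal in-memory model, not the production implementation; the collection names are assumptions. The key property shown is that records are destroyed outright rather than flagged for later purge.

```typescript
// Hypothetical model of the vault: collections mapping to record IDs.
// Collection names ("chats", "signals", "apiKeys") are assumptions.
const vault = new Map<string, Set<string>>([
  ["chats", new Set(["c1", "c2"])],
  ["signals", new Set(["s1"])],
  ["apiKeys", new Set(["k1"])],
]);

function totalVaultWipe(): void {
  // Hard delete: every record is removed immediately.
  // No soft-delete flag, no retention window, no backup copy.
  for (const records of vault.values()) {
    records.clear();
  }
}

totalVaultWipe();
```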
For inquiries regarding data architecture or privacy concerns, contact the system architect at ksrustylol@gmail.com.