Privacy & Security
Protecting your files from AI scanning and unauthorized access is the foundation of Choco Bin. We're transparent about what we do and don't do with your data.
For a detailed explanation of how Choco Bin's encryption and backup systems work, see our How It Works page.
🚫 What We Never Collect
Choco Bin doesn't collect the following:
- Your file contents or metadata
- IP addresses or tracking cookies
- Location data or device identifiers
- Usage analytics or behavioral data
- Information to sell or share with third parties
- Anything beyond what's necessary to run the app
📝 Zero-Logging Policy
Choco Bin has no servers. All encryption and data processing happens on your Mac.
- No user data passes through our infrastructure
- No logs about your files or usage
- No data shared with third parties
- We cannot see your files. All processing happens on your device.
🛡️ Your Rights & Control
You remain in control of your data:
- Delete your account and data anytime
- Export your files in decrypted form
- Stop using Choco Bin and your files stay in your clouds
- No data is kept longer than needed for the app to function
Why AI Scanning Matters
Cloud providers can scan your files to train AI models. Your documents, photos, and personal data may already be feeding AI systems without your meaningful consent. Here's why this is a problem:
- You don't own your data: Most cloud provider terms allow them to analyze your files for "product improvement," which includes AI training.
- Privacy becomes one-sided: Your files can be scanned to train AI, but you have no visibility into how your data is used.
- Competitive disadvantage: Your business strategies, designs, or customer data might inform AI systems that competitors use.
- Buried consent isn't meaningful consent: Agreement hidden in lengthy terms of service doesn't constitute real consent.
Choco Bin solves this by making AI scanning technically impossible. Your files are encrypted before they leave your device. Providers can't read them, analyze them, or train AI on them. Privacy is guaranteed by cryptography, not policy.
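The encrypt-before-upload idea can be sketched in a few lines. This is a conceptual illustration only, not Choco Bin's actual implementation or key management; it uses Fernet (symmetric authenticated encryption) from the third-party `cryptography` package, and the file name and contents are made up:

```python
from cryptography.fernet import Fernet

# Generate a symmetric key that never leaves the device.
# (How Choco Bin actually manages keys is not shown here.)
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the file contents locally, BEFORE any upload happens.
plaintext = b"Q3 strategy notes - confidential"
ciphertext = cipher.encrypt(plaintext)

# Only `ciphertext` is handed to the cloud sync client, so the
# provider (and its AI tooling) never sees readable data.
assert ciphertext != plaintext

# Only the key holder can recover the original bytes.
assert cipher.decrypt(ciphertext) == plaintext
```

The key point is the ordering: encryption happens on the device, so by the time a provider's AI could touch the file, there is nothing intelligible left to analyze.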
Real-World Examples: AI Integrated Into Cloud Storage
Major cloud providers are embedding AI tools directly into their services, making it impossible to opt out:
Microsoft OneDrive + Copilot
- What it does: Copilot can analyze the content of your OneDrive files and answer complex questions about them [1]
- The lock-in: OneDrive users get Copilot analyzing their files whether they want it or not. You can't use the storage without the AI [2]
- Microsoft's claim: They say uploaded files aren't used to train their models
- The problem: Even with privacy promises, the AI is still analyzing your files. When storage and AI are owned by the same company, users have no meaningful choice to opt out
Sources:
[1] Microsoft Support: Get started with Copilot in OneDrive
[2] The Register: Microsoft sets Copilot agents loose on your OneDrive files
Google Drive + Gemini
- What it does: Gemini can analyze files stored in Google Drive [3]
- Google's claim: They say they don't use your data to train models without permission
- The reality: TechRadar reported that Gemini has been caught scanning Google Drive files without explicit user permission [4]
- The problem: Even with official policies, the AI is analyzing your files. You can't prevent it from happening once files are uploaded
Sources:
[3] Google Drive Help: Get insights about your files with Gemini
[4] TechRadar: Gemini AI platform accused of scanning Google Drive files without user permission
The pattern is clear: Cloud providers are integrating AI analysis directly into storage services. You cannot opt out without abandoning the cloud service entirely. This is why encryption at the point of upload is the only reliable solution. Choco Bin ensures your files are encrypted before any cloud provider or their AI tools ever see them.