Privacy Policy

ManyLLM is built with privacy as a fundamental principle, not an afterthought.

Local-First · Zero-Cloud by Default · No Telemetry
Our Privacy Principles
The foundation of how ManyLLM handles your data.

Local Processing

All AI processing happens on your device. Your conversations, files, and models never leave your computer unless you explicitly configure external integrations.

No External Servers

ManyLLM doesn't connect to our servers for AI processing, analytics, or data collection. Your data stays with you.

Encrypted Storage

Local data is encrypted at rest using industry-standard encryption. Your conversations and files are protected even on your own device.

Opt-In Connectivity

Any network connections (updates, model downloads) require explicit user consent. You control when and how ManyLLM connects to the internet.

What We Don't Collect
ManyLLM is designed to minimize data collection.

We DO NOT collect:

  • Your conversations or chat history
  • Files you add to workspaces
  • Model outputs or responses
  • Usage analytics or telemetry
  • Personal information or identifiers
  • Device information or system specs

Optional data (with consent):

  • Crash reports (only if you opt in to help improve stability)
  • Update checks (can be disabled in settings)
  • Anonymous usage statistics (opt-in only, aggregated)

Local Data Storage
How your data is stored and managed on your device.

Storage Location

macOS: ~/Library/Application Support/ManyLLM/
Windows: %APPDATA%\ManyLLM\
Linux: ~/.local/share/ManyLLM/
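The paths above can be resolved programmatically. A minimal Python sketch (the function name is illustrative, not a ManyLLM API; the Linux branch additionally honors `XDG_DATA_HOME`, which is an assumption about conventional behavior rather than documented ManyLLM behavior):

```python
import os
import platform
from pathlib import Path

def manyllm_data_dir() -> Path:
    """Return the conventional ManyLLM data directory for this OS."""
    system = platform.system()
    if system == "Darwin":  # macOS
        return Path.home() / "Library" / "Application Support" / "ManyLLM"
    if system == "Windows":
        # %APPDATA% normally points to ...\AppData\Roaming
        return Path(os.environ["APPDATA"]) / "ManyLLM"
    # Linux and other Unix-likes: honor XDG_DATA_HOME if set
    xdg = os.environ.get("XDG_DATA_HOME")
    base = Path(xdg) if xdg else Path.home() / ".local" / "share"
    return base / "ManyLLM"
```

This is useful, for example, when scripting your own backups of the locally stored data.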

What's Stored Locally

  • Conversation history and workspaces
  • Model configurations and settings
  • File embeddings and context data
  • Application preferences

Data Portability

All your data is stored in open formats. You can export, back up, or migrate your data at any time. ManyLLM includes built-in export tools for conversations and workspaces.

Third-Party Integrations
How ManyLLM handles external services and integrations.

Model Providers

ManyLLM integrates with local model runtimes (Ollama, llama.cpp, MLX). These run entirely on your device and don't transmit data externally.

Optional Cloud Services

If you choose to configure external APIs (OpenAI, Anthropic, etc.), data will be sent to those services according to their privacy policies. This is entirely optional and disabled by default.

Important

When using external APIs, your data is subject to those providers' privacy policies. ManyLLM will clearly indicate when data might leave your device.

Updates and Changes
How we handle privacy policy updates.

This privacy policy may be updated to reflect changes in ManyLLM's features or legal requirements. We will notify users of significant changes through the application.

Version History

  • v1.0 (2024): Initial privacy policy for ManyLLM launch

Questions or Concerns
How to reach us about privacy matters.

If you have questions about this privacy policy or ManyLLM's data practices, please contact us:

Email: privacy@manyllm.com
GitHub: github.com/manyllm/manyllm

Last updated: December 2024

ManyLLM - Local-first AI, privacy by design.