Your Files, Your AI, Your Machine: Why Local LLMs Change Everything


Every time you type a search query into a cloud AI service, that query becomes someone else's data. Your file names. Your folder structures. Your work patterns. We decided that's not acceptable.

When we set out to add AI-powered search to FileFortress, we had a choice: take the easy route and pipe everything through a cloud API, or do the hard thing and make it work locally. We chose local.

The Problem with Cloud AI and Your Files

Think about what happens when you use a cloud-based AI service to search your files. Your query — "find the contract PDFs from the Johnson deal" — gets sent to a remote server. That server now knows you have a Johnson deal. It knows you store contracts as PDFs. Over time, these queries paint a detailed picture of your file library, your work, and your life.

For a tool built on privacy-first principles, this is a non-starter. FileFortress already ensures your file metadata never leaves your device. Why would we break that promise the moment you use AI?

The Hidden Cost

Most AI-powered search tools send your queries to external servers. Even if they don't store them permanently, your file names and search patterns are transmitted, processed, and potentially logged by third parties.

AI That Runs on Your Machine

FileFortress AI search uses Microsoft Foundry Local as its default provider. Foundry Local runs language models — like Phi-3 — directly on your hardware. No cloud. No API key. No data leaving your device.

Here's what happens when you run an AI search:

  1. You type a natural language query like "find large videos from Google Drive."
  2. FileFortress sends your query to a model running on localhost.
  3. The model returns structured JSON filters (media type, size, remote, date, etc.).
  4. FileFortress applies those filters to your local encrypted index.
  5. Results appear. At no point did anything leave your machine.
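The steps above can be sketched in a few lines of Python. This is an illustrative sketch, not FileFortress internals: the filter field names (media_type, min_size, remote) and the index shape are assumptions about what the structured JSON from step 3 might look like.

```python
# Illustrative sketch of steps 3-4, not FileFortress's actual code.
# The filter keys (media_type, min_size, remote) are assumed names.
def apply_filters(index, filters):
    """Apply AI-produced structured filters to a local file index."""
    results = index
    if "media_type" in filters:
        results = [f for f in results if f["media_type"] == filters["media_type"]]
    if "min_size" in filters:
        results = [f for f in results if f["size"] >= filters["min_size"]]
    if "remote" in filters:
        results = [f for f in results if f["remote"] == filters["remote"]]
    return results

# A toy local index, plus the filters a model might return for
# "find large videos from Google Drive":
index = [
    {"name": "talk.mp4", "media_type": "video", "size": 2_500_000_000, "remote": "gdrive"},
    {"name": "notes.txt", "media_type": "text", "size": 4_096, "remote": "gdrive"},
    {"name": "clip.mov", "media_type": "video", "size": 900_000_000, "remote": "local"},
]
filters = {"media_type": "video", "min_size": 1_000_000_000, "remote": "gdrive"}
print([f["name"] for f in apply_filters(index, filters)])  # → ['talk.mp4']
```

The key point: the model only translates language into filters; the filtering itself runs against the index that never leaves your disk.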

# Your first AI search — completely local
filefortress ai "find large videos from Google Drive" --dry-run

# See what the AI interpreted before executing
filefortress ai "find large videos from Google Drive" --dry-run --explain

# Satisfied? Run it for real
filefortress ai "find large videos from Google Drive"

Auto-Detection

If Microsoft Foundry Local is installed on your machine, FileFortress detects it automatically. Zero configuration required — just start searching.
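One plausible way such detection can work, sketched below, is probing the runtime's localhost port. Both the mechanism and the port number here are assumptions for illustration, not Foundry Local or FileFortress internals.

```python
import socket

# Sketch of local-runtime detection by probing a localhost port.
# The port number is an assumed placeholder, not Foundry Local's
# actual port.
def is_local_runtime_up(host="127.0.0.1", port=5273, timeout=0.25):
    """Return True if something is listening at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if is_local_runtime_up():
    print("Local AI runtime detected; no configuration needed.")
else:
    print("No local runtime found; falling back to configured providers.")
```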

What "Local AI" Actually Means in Practice

Running AI locally isn't just a privacy checkbox. It changes the entire experience:

No API keys to manage. With Foundry Local as the default, you don't need to sign up for anything, manage billing, or rotate keys. Install Foundry, and FileFortress handles the rest.

Works offline. Once the model is downloaded, AI search works without an internet connection. Your local index is already on your machine — the AI model is too. Search your files on a plane, in a cabin, or anywhere without connectivity.

Your query history stays local. Cloud AI providers can log your queries for training, analytics, or compliance. With local models, your search history exists only on your device. Delete it, and it's gone.

Your search patterns are private. Over time, search queries reveal what you work on, when you work, and what matters to you. With local AI, those patterns belong to you alone.

When You Need More Power

Local models are impressive — and getting better fast — but sometimes you need the raw capability of a cloud-hosted model. FileFortress supports that too:

  • OpenAI — GPT-4 and later models
  • Azure OpenAI — enterprise-grade, your own Azure tenant
  • OpenRouter — access to dozens of models through one API
  • Ollama — run any open-weight model locally
  • Any OpenAI-compatible endpoint — self-hosted or custom solutions

Even with cloud providers, FileFortress protects you:

What Gets Sent to Cloud Providers

Only your search prompt text (e.g., "find large videos from Google Drive") is sent to the cloud AI provider. Your file names, folder structures, metadata, and search results never leave your device. Use --dry-run to preview exactly what the AI interprets before any search runs.
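To make that concrete, here is a sketch of what an OpenAI-style chat-completions request body might carry. The model name and system prompt wording are assumptions, not FileFortress's exact payload; the point is that only the search text appears in the request, with no file names, folder structures, or index contents.

```python
import json

# Sketch of an OpenAI-compatible chat-completions request body.
# Only the user's search text travels in the payload; the local
# index is never serialized into the request.
def build_request(prompt, model="gpt-4o"):
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Translate the user's search into JSON filters."},
            {"role": "user", "content": prompt},
        ],
    }

body = build_request("find large videos from Google Drive")
print(json.dumps(body, indent=2))
```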

The Future Is Local

We're betting on local AI because the trajectory is clear. Models that required data center GPUs two years ago now run on a laptop with 8GB of RAM. Phi-3 fits in under 4GB. The gap between cloud and local model quality is shrinking with every release.
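The arithmetic behind that claim is simple: weight memory is roughly parameters times bits per weight. Taking Phi-3-mini's approximately 3.8 billion parameters as the input:

```python
# Rough weight-memory estimate: parameters * bits-per-weight / 8 bytes.
# Ignores KV cache and runtime overhead; 3.8B is Phi-3-mini's
# approximate parameter count.
def weights_gb(params: float, bits: int) -> float:
    return params * bits / 8 / 1e9

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: {weights_gb(3.8e9, bits):.1f} GB")
# 16-bit: 7.6 GB, 8-bit: 3.8 GB, 4-bit: 1.9 GB; quantized builds
# fit comfortably under 4 GB.
```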

FileFortress is built for this future:

  • More local providers beyond Foundry and Ollama are on the way
  • The OpenAI-compatible abstraction means any new local runtime can be plugged in
  • As local models improve, so does FileFortress AI search — automatically

We believe the best AI for file management is the AI that never phones home. Your files are personal. Your search queries are personal. Your AI should be personal too.

Try It Now

If you have Microsoft Foundry Local installed, AI search is ready to go. Start with a safe dry-run:

# Preview what the AI interprets — nothing executes
filefortress ai "show me my largest files" --dry-run

# See the AI's reasoning
filefortress ai "show me my largest files" --dry-run --explain

# Execute with confidence
filefortress ai "show me my largest files"
